## IACS Research Day Faculty Talks: Xiangmin (Jim) Jiao

>>All right. Yes, I'm Jim Jiao in the Department of Mathematics and Statistics. So I'll be talking about "Variational Methods and Algorithms". But that sounds boring, so I changed the title to "Fighting Crimes and Breaking Barriers". The crimes we are fighting are variational crimes in mathematics, and the barriers we are breaking are the seemingly impossible: someone proves a theorem saying that something is [inaudible], and we try to break through by showing that the theorem was actually wrong. So in my research group, we work on basically
the three circles of computational science that Robert showed. Some of my students are working on [inaudible] algorithms and codes for solving linear algebra problems [inaudible], trying to get to the holy grail of linear-complexity algorithms: [inaudible] black-box solvers that are near optimal and robust. Oliver and Joshua are working on these problems. And other students, including Joshua and Oliver, are working on computational fluid dynamics, [inaudible], climate modeling, and [inaudible] physics. So when I'm not teaching or meeting
with the students, I spend my time thinking about the mathematical foundations of numerical methods for computational science, in particular for partial differential equations: what are the optimal conditions, what are the limitations of the most commonly used methods, and are there hybrid methods that can overcome these limitations while maintaining the [inaudible]? So that's basically an overview of the activities
of my group. So here is a brief diagram of the history of numerical methods, dating back to the 1700s, starting with Taylor expansions. And Taylor, I think, is the godfather of numerical analysis, because in numerical analysis textbooks, on every other page you have Taylor expansions to do the analysis. And from there to now, as we saw, there are all kinds of challenging problems in computational science, in fluid dynamics or mechanics, and many are based on finite difference methods for [inaudible], which were mostly developed in the 1920s. In terms of theoretical foundations, we have
conditions for the theorems and so on. And finally, finite volume and finite element methods were developed in the 1950s and '60s, and they got really popular in the 1970s. And their theory was mostly based on [inaudible] spaces, functional analysis, all those things. And after that time, numerical linear [inaudible] was really powerful, and it unified many of the mathematical — well, you can use it to analyze much of the theory behind numerical analysis, except that they are not really compatible [inaudible]. They have a set of rules. And they are also like [inaudible] functions,
and so on. So basically, there are different branches of mathematics that are the foundations for the numerical methods we are using nowadays. The finite difference methods are mostly based on [inaudible], functional analysis. And the finite element methods are 99% based on Hilbert-space analysis, which is incompatible with anything else — well, pretty much; I cannot say exactly. So anyway, there has been a lot of work done on the mathematical foundations of [inaudible] methods. The 1970s were the golden era. And even with the mathematical foundations well established,
so why am I still thinking about these problems? Let's start by thinking about the successes we already have so far. The [inaudible] method in particular has been really successful in solving [inaudible] elasticity, and so on. And the classic textbook, the [inaudible] text, citing works by [inaudible], claims that actually with [inaudible] boundaries and using [inaudible], and that's in the textbook; that's the [inaudible] textbook. And now here is a first-grade math question: "[Inaudible]," which means that P is equal to 2. "Based on this claim, what is the order you should be getting for these two problems?" Solving this [inaudible] on these two domains with curved boundaries, where this one has a concave boundary and that one has a convex boundary. A first-grade math question. [Laughter] So it's P plus one; that's three. Okay, any other guesses?

>>[Inaudible] can you represent the boundaries?

>>The boundary is represented with [inaudible]
and parametric elements, which is [inaudible] to represent it. And the theory says it should be [inaudible]. Well, if you look at some analyses, they claim that convex versus concave will make a difference, and so on. But here's the thing: this curve shows the convergence rate for the parametric elements. For concave and convex domains, it does not matter whether the boundary is convex or concave. And the theory — the classical textbook and the textbook analysis — was incorrect. It's second-order, not third-order accurate. So Matt mentioned that they have this problem that they cannot derive something; well, we now have textbooks that have been around for like 50 years, and they're incorrect. I don't know which one is worse. So basically, what has gone wrong? What has gone wrong is that there are hidden
implicit assumptions in the classical analysis. I'm not going to go into details, but basically, it has in places an assumption of continuity of the functions, so that you can apply the theorems from functional analysis; but that assumption is broken: you have discontinuous [inaudible] directions. And this assumption is hidden, and it is not analyzed, because of the so-called variational crimes. So recently, if you work on PDEs — you probably
have heard of isogeometric analysis. It's really, really popular, with [inaudible] followers, developed by [inaudible], and it was meant to overcome these [inaudible] issues without — well, there was no analysis; basically, they found that this works. But exact geometry is only sufficient when there is another necessary condition. And it makes mesh generation complicated. So understanding the root cause of the problem will make our lives much simpler, in terms of better algorithms: faster, simpler. So, back to the question on the earlier slide: is the mathematical foundation really established already? Well, here is another quote from the textbook
about the optimal convergence rate when you have the [inaudible] — so this is the optimal bound. You should get this on the previous picture, and on this picture this is the best you can get. And there are also constraints about angle conditions on the meshes for many [inaudible] methods: maximum angle, minimum angle. And these have been used for decades and decades by all the researchers in [inaudible] methods and [inaudible] metrics. So here is a question for you: "Between these two meshes — these correct
meshes, which one is better? Which one will give you better accuracy?" It looks like the left one gives you better angles; maybe it will give you better accuracy. Or maybe it doesn't matter, because people [inaudible] optimal. Here's the surprise: it's the second mesh. Even though it is simple and looks bad in terms of angles, it gives you a fourth-order convergence rate, and almost two orders of magnitude better accuracy than the left one. So all the people who spent their lifetimes
developing like [inaudible] triangulation and all those things, well, sorry, the theory was wrong in terms of guidelines. So basically, the classical analysis could not explain why you get fourth-order accuracy. And it could not explain why symmetric meshes are better, which means that a lot of the optimal methods claimed in the textbook [inaudible] literature are actually aiming for suboptimal convergence rates. And so our goal is to develop truly optimal methods that are based on correct mathematical foundations, and to significantly reduce computational cost and increase accuracy. Okay, so in summary, as I told
you, we are trying to develop unified mathematical foundations, and that's what I'm dreaming about every day, by integrating approximation theory, PDE theory, functional analysis, differential and computational geometry, linear [inaudible]: basically, unifying the 300 years of history of numerical methods so that there will no longer be crimes when you combine them. And why is this important? Because without this understanding, we cannot really develop truly optimal methods. And that's what my group's students are focusing on: [inaudible] methods based on sound mathematical foundations. Hopefully, I get them right. And finally, we apply them to computational [inaudible] dynamics, climate modeling, and [inaudible]. I have posters outside on methods for [inaudible] modeling and for [inaudible] transfer; and that's it [inaudible].

[ Applause ]

>>Jim, thanks very much.

>>Thank you. Questions?

>>Question: I know there are lots of people
using meshes and grids in the audience.

>>I have a question.

>>Yes.

>>In your last PowerPoint picture, you demonstrated that the [inaudible] is better than the regular one.

>>Actually, this one is better than this one.

>>Which one?

>>This one.

>>The symmetric one.

>>Yes.

>>Okay.

>>Yes. Yes.

>>My question was, if you can easily [inaudible], we need adaptive meshes.

>>Yes, you can. So that requires a very solid analysis to characterize this. I call this a "uniform" or [inaudible] property, like [inaudible] background. And there is a [inaudible] so that you can distort it slightly and still get the superconvergence. And that's — yes, that's complicated.

>>Okay. Yes, so I have a similar question, actually,
about this. So I have been using both irregular and regular meshes, and I know that everything is nice with regular meshes unless you have irregular boundaries.

>>Yes.

>>Unless you have regular boundaries, you have to go to an irregular mesh.

>>Yes.

>>And then the question is whether you want to connect really irregular and regular meshes –

>>Yes.

>>Because then you can get [inaudible].

>>That's right. So this is this one here. The [inaudible] methods are confined by functional analysis, and they cannot break out of functional analysis. So to do this, you really have to have a generalized variational formulation, where you break out of functional analysis near the boundaries. Near the boundaries it is a nonlinear problem, and functional analysis treats it as a linear problem. The theory is incorrect. That's basically why with curved boundaries you have — you lose the convergence. P plus 1 is gone. It just drops to P even when you integrate exactly, because that's the functional-analysis limiting factor. So the short answer is that you cannot really do that with [inaudible] itself; you have to use some [inaudible] near your boundaries to do it, with nonlinear problems [inaudible]. Yes.

>>Yes. Thanks very much.

>>Thank you. [Applause]
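The order-of-accuracy check at the heart of the talk (the theory predicts order P + 1 = 3 for quadratic elements, but the experiments show order 2 on curved domains) can be sketched with a short script. This is a hypothetical illustration, not code from the talk; the errors below are synthetic, generated to follow the second-order behavior the speaker reports observing, and the script simply shows how an observed convergence order is estimated from errors on successively refined meshes.

```python
import math

def observed_order(errors, refinement_ratio=2.0):
    """Estimate the observed convergence order between successive
    mesh refinements: order = log(e_coarse / e_fine) / log(ratio)."""
    return [math.log(e_coarse / e_fine) / math.log(refinement_ratio)
            for e_coarse, e_fine in zip(errors, errors[1:])]

# Synthetic errors following e = C * h^2 (second order), the rate
# actually observed on the curved domains, rather than the
# h^(P+1) = h^3 that the classical textbook analysis predicts.
mesh_sizes = [0.1 / 2**k for k in range(4)]
errors = [0.5 * h**2 for h in mesh_sizes]

print(observed_order(errors))  # each entry is 2.0
```

If a mesh converged at the rate the textbook claims for P = 2 elements, the same computation applied to real finite element errors would print values near 3.0 instead.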
