\chapter{Model}
\label{ch:models}
In this chapter we consider a generalization of the model studied by Dipierro et al.\
in~\cite{dipierro2020disconnectedness}, where the external data \( E_0 \) is the complement of a
slab in \( \mathbb{R}^n \) of width \( 2M \) and the prescribed data \( \Omega \) is the cylinder of
radius \( 1 \) and height \( 2M \). They showed that for \( M \) large enough the minimizer is
disconnected, which is consistent with the classical theory of minimal surfaces. When \( M \) is
small enough, however, the minimizer is connected and even sticks to the boundary; the latter is a
property unique to nonlocal minimal surfaces. \\
Here we will show that for any external data \( E_0 \) such that
\begin{gather*}
E_R \coloneqq \{(x^\prime, x_n) \mid \lvert x^\prime \rvert < 1,\ M < \lvert x_n \rvert < M + R \} \subset E_0 \subset \{(x^\prime, x_n) \mid \lvert x_n \rvert > M \}
\end{gather*}
and prescribed data \( \Omega \coloneqq \{(x^\prime, x_n) \mid \lvert x^\prime \rvert < 1,\ \lvert
x_n \rvert < M \} \), for some \( M, R > 0 \), the model displays the same behavior as the one
considered in~\cite{dipierro2020disconnectedness}. That is, for \( M \) large enough the minimizer
is disconnected, and for \( M \) small enough the minimizer is connected and sticks to the boundary. \\
For \( n \geq 2 \) consider any external data \( E_0 \) and prescribed data \( \Omega \) as above.
\Cref{fig:101} illustrates the setting.
\begin{figure}[ht]
\centering
\def\svgscale{1}
\import{figures/model_general}{model_general_base.pdf_tex}
\caption{Basic setting of the model with external data \( E_0 \) and prescribed data \( \Omega \).}
\label{fig:101}
\end{figure}
We state the following two results, which we will prove afterward.
\begin{theorem}
\label{thm:101}
For \( E_0 \) and \( \Omega \) as above and any \( R > 0 \) there exists an \( M_0 \in (0, 1) \)
depending on the dimension, \( R \) and \( s \), such that for any \( M \in (0, M_0) \) the
minimizer \( E_M \) is given by \( E_M = E_0 \cup \Omega \).
\end{theorem}
\begin{theorem}
\label{thm:102}
For \( E_0 \) and \( \Omega \) as above and any \( R > 0 \) there exists an \( M_0 > 1 \)
depending on the dimension, \( R \) and \( s \), such that for any \( M > M_0 \) the minimizer \(
E_M \) is disconnected.
\end{theorem}
For the first proof, we will follow a construction similar to that in~\cite{dipierro2020disconnectedness}. \\
In~\cite{caffarelli2009nonlocal} the authors have shown that nonlocal minimizers satisfy the
Euler-Lagrange equation in the viscosity sense, i.e.\ if \( E \) is a minimizer and there exist a
point \( q \in \partial E \), a radius \( r > 0 \) and a unit vector \( \nu \in \mathbb{R}^n \) such
that \( B_r (q + r \nu) \subset E \), then
\begin{gather*}
\int_{\mathbb{R}^n} \frac{\chi_{E^c}(y) - \chi_E (y)}{\lvert y - q\rvert^{n + s}} \dd{y} \geq 0. \tagged\label{eq:101}
\end{gather*}
In the proof we will assume that there exists a minimizer which is not \( E_0 \cup \Omega \). To
bring this assumption to a contradiction, we want to show that the left-hand side of \cref{eq:101}
is negative for \( M \) small enough. Thus, we have to construct a suitable interior tangent ball
to which we can apply the Euler-Lagrange equation. We construct this ball by sliding a ball of some
suitable radius \( r \in (0, 1) \) down in the \( e_n \) direction. If the minimizer is not \( E_0
\cup \Omega \), then at some point the ball touches the minimizer, which yields a touching point \(
q \). We then split the domain into four parts and estimate each part to obtain the contradiction.
To deal with the integration close to \( q \) we follow the construction
in~\cite{caffarelli2009nonlocal} and reflect the ball at the touching point to cancel symmetric
parts.
\begin{proof}[Proof of \Cref{thm:101}]
Proof by contradiction. Assume the minimizer \( E_M \) is not \( E_0 \cup \Omega \). Then we can
slide a ball of some radius \( r \) down in the \( e_n \) direction until it touches the minimizer.
We consider the ball \( B_r (t e_n) \). Since \( E_M \) is not \( E_0 \cup \Omega \), there exist
\( r_0 \in (0, 1) \) and \( t_0 > 0 \) such that \( \partial B_{r_0} (t_0 e_n) \cap \partial E_M
\neq \emptyset \) and \( B_{r_0} (t e_n) \subset E_M \) for all \( t > t_0 \), see \Cref{fig:102}.
In the following we define \( z \coloneqq t_0 e_n \).
\begin{figure}[ht]
\centering
\def\svgscale{1}
\import{figures/model_general}{model_general_balls.pdf_tex}
\caption{Sliding ball down and reflecting at the touching point \( q \).}
\label{fig:102}
\end{figure}
Since \( E_M \) is a minimizer, it is also a viscosity solution of the Euler-Lagrange equation and
the inequality
\begin{gather*}
\int_{\mathbb{R}^n} \frac{\chi_{E_M^c}(y) - \chi_{E_M}(y)}{\lvert y - q
\rvert^{n + s}} \dd{y} \geq 0 \tagged\label{eq:103}
\end{gather*}
holds, where \( q \in \partial B_{r_0} (z) \cap \partial E_M \), since \( B_{r_0} (z) \) is
an interior tangent ball to \( E_M \).
We will bring this to a contradiction by showing that the left-hand side is negative for \( M \)
small enough. To this end, we split the domain and estimate each part.
\begin{figure}[ht]
\centering
\def\svgscale{1}
\import{figures/model_general}{model_general_split.pdf_tex}
\caption{Splitting of the domain.}
\label{fig:103}
\end{figure}
We define the following sets, see \Cref{fig:103}:
\begin{align*}
A & \coloneqq \{(x^\prime, x_n) \mid \lvert x^\prime -q^\prime \rvert < 2, \lvert x_n - q_n \rvert < 2M \}, \\
B & \coloneqq \{(x^\prime, x_n) \mid \lvert x^\prime - q^\prime \rvert < 2, \lvert x_n - q_n \rvert < R \}, \\
C & \coloneqq E_M^c \setminus B, \\
D & \coloneqq E_M \setminus A,
\end{align*}
where \( R \) is chosen such that \( E_R \subset E_0 \), and we choose \( M \) such that \( 2M < R
\) for technical reasons, which will become clear later. \\
First let us assume that \( R \geq 2 \).
We start by estimating the integral over \( C \). Notice that \( C \subset E_M^c \) and \( C
\subset \{\lvert y - q \rvert \geq 2 \} \). Thus, we can bound
\begin{align*}
\int_C \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
= \int_C \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y}
\leq \int_{\lvert y \rvert \geq 2} \frac{1}{\lvert y \rvert^{n + s}} \dd{y}
\leq c(n, s) 2^{- s},
\end{align*}
where \( c(n, s) \) is some positive constant depending on the dimension and the parameter \( s
\). \\
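For completeness, the tail integral can be computed in polar coordinates; here \( \omega_{n - 1} \) denotes the surface measure of the unit sphere \( \partial B_1 \):
\begin{align*}
\int_{\lvert y \rvert \geq 2} \frac{1}{\lvert y \rvert^{n + s}} \dd{y}
= \omega_{n - 1} \int_2^\infty \rho^{n - 1} \, \rho^{-(n + s)} \dd{\rho}
= \omega_{n - 1} \int_2^\infty \rho^{- 1 - s} \dd{\rho}
= \frac{\omega_{n - 1}}{s} \, 2^{- s},
\end{align*}
so one admissible choice is \( c(n, s) = \frac{\omega_{n - 1}}{s} \). \\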
Next we estimate the integral over \( D \). This part is the negative contribution to our integral,
and we want its magnitude to grow as \( M \) becomes smaller. First notice that \( D \subset E_M \),
thus the integral is negative. To get an upper bound we can restrict the domain to something
smaller. We choose the ball \( B_M (q + 3Me_n) \) if \( q_n \) is negative, or \( B_M (q - 3Me_n)
\) if \( q_n \) is positive, see \Cref{fig:104}. If needed, we can also shift the ball closer to the
origin by shifting it by \( M \) in the \( \frac{-q^\prime}{\lvert q^\prime \rvert} \) direction.
The important point is that the distance between the ball and \( q \) scales with \( M \). Finally,
we multiply the integral by \( \frac{1}{2} \): not the whole ball need lie in \( D \), but since we
chose \( 2M < R \), the half of the ball closer to \( q \) certainly does. Thus, we have
\begin{figure}[ht]
\centering
\def\svgscale{1}
\import{figures/model_general}{model_general_mball.pdf_tex}
\caption{Restriction of the domain for the integral over \( D \).}
\label{fig:104}
\end{figure}
\begin{align*}
\int_D \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
= - \int_D \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y}
\leq - c(n) \int_{B_M (3Me_n)} \frac{1}{\lvert y \rvert^{n + s}} \dd{y}
\leq - c(n, s) M^{- s}.
\end{align*}
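The last bound can be seen from the scaling of the kernel: substituting \( y = M x \) gives
\begin{align*}
\int_{B_M (3Me_n)} \frac{1}{\lvert y \rvert^{n + s}} \dd{y}
= \int_{B_1 (3e_n)} \frac{M^n}{{\lvert M x \rvert}^{n + s}} \dd{x}
= M^{- s} \int_{B_1 (3e_n)} \frac{1}{\lvert x \rvert^{n + s}} \dd{x},
\end{align*}
and the remaining integral is a finite positive constant depending only on \( n \) and \( s \),
since \( \overline{B_1 (3e_n)} \) is bounded away from the origin.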
Finally, we estimate the integral over the rest of the domain \( S \coloneqq \mathbb{R}^n \setminus
(C \cup D) \). First notice that
\begin{align*}
\int_{S\cap B_{r_0} (q) \cap B_{r_0} (z)} \frac{1}{\lvert y-q \rvert^{n + s}}\dd{y}
= \int_{S\cap B_{r_0} (q) \cap B_{r_0} (\bar{z})} \frac{1}{\lvert y-q \rvert^{n + s}}\dd{y}
\end{align*}
since the integrand and the domain are symmetric under the point reflection through \( q \), where \( \bar{z} \coloneqq 2q - z \) denotes the reflection of \( z \) through \( q \). Since \( B_{r_0} (z) \subset E_M \) we have that
\begin{align*}
& \int_{S\cap B_{r_0} (q) \cap B_{r_0} (z)} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y-q \rvert^{n + s}}\dd{y} + \int_{S\cap B_{r_0} (q) \cap B_{r_0} (\bar{z})} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y-q \rvert^{n + s}}\dd{y} \\
\leq & - \int_{S\cap B_{r_0} (q) \cap B_{r_0} (z)} \frac{1}{\lvert y-q \rvert^{n + s}}\dd{y} + \int_{S\cap B_{r_0} (q) \cap B_{r_0} (\bar{z})} \frac{1}{\lvert y-q \rvert^{n + s}}\dd{y} = 0. \tagged\label{eq:102}
\end{align*}
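The symmetry used for \Cref{eq:102} can be checked directly: the point reflection \( y \mapsto 2q - y \) has unit Jacobian, and with \( \bar{z} = 2q - z \) we have
\begin{gather*}
\lvert (2q - y) - q \rvert = \lvert y - q \rvert
\qquad \text{and} \qquad
\lvert (2q - y) - \bar{z} \rvert = \lvert (2q - y) - (2q - z) \rvert = \lvert z - y \rvert,
\end{gather*}
so the kernel is invariant, \( B_{r_0} (q) \) is mapped onto itself and \( B_{r_0} (z) \) is mapped onto \( B_{r_0} (\bar{z}) \).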
Thus, we have
\begin{align*}
\int_S \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
& \overset{\Cref{eq:102}}{\leq} \int_{S \setminus B_{r_0} (q)} \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y} + \int_{S \cap(B_{r_0} (q)\setminus (B_{r_0} (z) \cup B_{r_0} (\bar{z})))} \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y}. \tagged\label{eq:106}
\end{align*}
For the first term we will use that \( S \subset B \subset B_{R + 2} \) and for the second term we
will use Lemma 3.1 from~\cite{Dipierro2016} with the set \( P_{r_0, 1} \). We then get
\begin{align*}
\Cref{eq:106}
& \leq \int_{B_{R + 2} \setminus B_{r_0}} \frac{1}{\lvert y \rvert^{n + s}} \dd{y} + \int_{P_{r_0, 1}} \frac{1}{\lvert y \rvert^{n + s}} \dd{y} \\
& \leq c(n, s) (r_0^{-s} - {(R + 2)}^{- s}) + c^\prime (n, s) r_0^{-s} \\
& \leq c(n, s) (r_0^{-s} - {(R + 2)}^{- s}),
\end{align*}
after enlarging the constant \( c(n, s) \) in the last step.
Since we want to contradict that \( \partial B_r (t e_n) \cap \partial E_M \neq \emptyset \) for
all \( r \in (0, 1) \) and \( t \), it is enough to consider \( r_0 \) large. Indeed, if there
exists some \( t_0 \) such that \( \partial B_{r_0} (t_0 e_n) \cap \partial E_M \neq \emptyset \)
for some \( r_0 \), then the same holds for all \( r \in (r_0, 1) \). Conversely, if \( \partial
B_{r_0} (t e_n) \cap \partial E_M = \emptyset \) for all \( t \), then the same holds for all \( r
\in (0, r_0) \). In particular, we can choose \( r_0 = 1 \), since \( B_1 (z) \subset E_M \) and
\Cref{eq:103} still holds.
Thus, in total we have that
\begin{align*}
\int_{\mathbb{R}^n} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
& \leq - c_0 M^{- s} + c_1 (2^{- s} + 1 - {(R + 2)}^{- s}) \\
& = - c_0 M^{- s} \left(1 - \frac{c_1}{c_0} \left({\left(\frac{M}{2}\right)}^s + M^s - {\left(\frac{M}{R + 2}\right)}^s\right)\right).
\end{align*}
Now we can choose \( M \) small enough such that the right-hand side is negative. Thus, we have
contradicted our assumption that the minimizer is not \( E_0 \cup \Omega \) for \( R \geq 2 \). \\
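The smallness condition can be made explicit: the first line of the estimate above is negative as soon as
\begin{gather*}
\frac{c_1}{c_0} M^s \left(2^{- s} + 1 - {(R + 2)}^{- s}\right) < 1,
\end{gather*}
and since the left-hand side tends to \( 0 \) as \( M \to 0^+ \), this yields an admissible
threshold \( M_0 = M_0 (n, s, R) \) as claimed in \Cref{thm:101}. \\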
Let us now consider the case \( R < 2 \). Then for the integral over \( C \subset \{\lvert y - q
\rvert \geq R \} \) we have
\begin{align*}
\int_C \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
= \int_C \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y}
\leq \int_{\lvert y \rvert \geq R} \frac{1}{\lvert y \rvert^{n + s}} \dd{y}
\leq c(n, s) R^{- s}.
\end{align*}
The integrals over \( D \) and \( S \) are estimated as before, but this time we cannot choose \(
r_0 = 1 \), since it could be that \( B_1 (z) \not\subset E_M \). Still, we can choose \( r_0 =
\frac{R}{2} \), since \( B_{\frac{R}{2}} (z) \subset E_M \). Thus, we have
\begin{align*}
\int_{\mathbb{R}^n} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
& \leq - c_0 M^{-s} + c_1 (R^{-s} + r_0^{-s} - {(R + 2)}^{-s}) \\
& = - c_0 M^{-s} \left(1 - \frac{c_1}{c_0} \left({\left(\frac{M}{R}\right)}^s + {\left(\frac{2M}{R}\right)}^s - {\left(\frac{M}{R + 2}\right)}^s\right)\right).
\end{align*}
Again, we choose \( M \) small enough such that the right-hand side is negative.
Notice that in this case we have only shown so far that the cylinder \( Z_R \coloneqq
B^\prime_{\frac{R}{2}} \times (- M, M) \) is part of the minimizer. \\
In the classical case we could now conclude connectedness of the minimizer, since we have found a
cylinder which connects the external data; but, as it turns out, in the setting of nonlocal minimal
surfaces there could still exist some disconnected part of the minimizer. More on that in
\Cref{ch:disconnected_minimizer}. \\
To prove that the minimizer is \( E_0 \cup \Omega \) we proceed similarly, but
instead of sliding the ball down, we push a ball outwards from inside the cylinder. Since we
have shown that the cylinder \( Z_R \) is part of the minimizer, the ball \( B_{\frac{R}{2}} (h
e_n) \) for any \( h \in (- M, M) \) is part of the minimizer. We push this ball outwards (in
any direction; w.l.o.g.\ we choose \( e_1 \)). We assume again that \( E_M \neq E_0 \cup
\Omega \). Then for the ball \( B_{\frac{R}{2}} ((1 - \frac{R}{2} - t)e_1 + he_n) \) with \( t \in
(0, 1 - \frac{R}{2}) \) and \( h \in (- M, M) \) there exist some \( t_0 \) and \( h_0 \) such
that the ball touches the minimizer, i.e.\ \( \partial B_{\frac{R}{2}} ((1 - \frac{R}{2} - t_0)e_1 +
h_0 e_n)\cap \partial E_M \neq \emptyset \), and for all \( t \in (t_0, 1 - \frac{R}{2}) \) we have
that \( B_{\frac{R}{2}} ((1 - \frac{R}{2} - t)e_1 + h_0 e_n) \subset E_M \). Again we define \( z
\coloneqq (1 - \frac{R}{2} - t_0)e_1 + h_0 e_n \).
Now we can estimate the integral again by splitting the domain. We define the following sets:
\begin{align*}
A & \coloneqq \{(x^\prime, x_n) \mid \lvert x^\prime - q^\prime \rvert < R, \lvert x_n - q_n \rvert < 2M \}, \\
B & \coloneqq \{(x^\prime, x_n) \mid \lvert x^\prime - q^\prime \rvert < R, \lvert x_n - q_n \rvert < R \}, \\
C & \coloneqq E_M^c \setminus B, \\
D & \coloneqq E_M \setminus A.
\end{align*}
The integrals over \( C \) and \( D \) are estimated as before. We have that
\begin{align*}
\int_C \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
& \leq \int_{\lvert y \rvert \geq R} \frac{1}{\lvert y \rvert^{n + s}} \dd{y} \leq c(n, s) R^{- s}
\end{align*}
and
\begin{align*}
\int_D \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
\leq - c(n) \int_{B_M (3Me_n)} \frac{1}{\lvert y \rvert^{n + s}} \dd{y} \leq - c(n, s) M^{- s}.
\end{align*}
For the integral over the rest of the domain \( S \), we proceed as before. First we have again
\begin{gather*}
\int_{S\cap B_{\frac{R}{2}} (q) \cap B_{\frac{R}{2}} (z)} \frac{1}{\lvert y-q \rvert^{n + s}}\dd{y} = \int_{S\cap B_{\frac{R}{2}} (q) \cap B_{\frac{R}{2}} (\bar{z})} \frac{1}{\lvert y-q \rvert^{n + s}}\dd{y}.
\end{gather*}
Thus, we have
\begin{gather*}
\int_S \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
\leq \int_{S \setminus B_{\frac{R}{2}} (q)} \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y} + \int_{S \cap (B_{\frac{R}{2}} (q)\setminus (B_{\frac{R}{2}} (z) \cup B_{\frac{R}{2}} (\bar{z})))} \frac{1}{\lvert y - q \rvert^{n + s}} \dd{y}. \tagged\label{eq:105}
\end{gather*}
For the first term we will use that \( S \subset B \subset B_4 \) and for the second term we will
use Lemma 3.1 from~\cite{Dipierro2016} with the set \( P_{\frac{R}{2}, 1} \). We then
get
\begin{align*}
\Cref{eq:105}
& \leq \int_{B_4 \setminus B_{\frac{R}{2}}} \frac{1}{\lvert y \rvert^{n + s}} \dd{y} + \int_{P_{\frac{R}{2}, 1}} \frac{1}{\lvert y\rvert^{n + s}} \dd{y} \\
& \leq c(n, s) \left({\left(\frac{R}{2}\right)}^{-s} - 4^{-s}\right) + c^\prime (n, s) {\left(\frac{R}{2}\right)}^{-s} \\
& \leq c(n, s) (R^{-s}-8^{-s}).
\end{align*}
Thus, in total we have that
\begin{align*}
\int_{\mathbb{R}^n} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
& \leq - c_0 M^{- s} + c_1 (R^{- s} - 8^{-s}) \\
& = - c_0 M^{- s} \left(1 - \frac{c_1}{c_0} \left({\left(\frac{M}{R}\right)}^s - {\left(\frac{M}{8}\right)}^s\right)\right).
\end{align*}
Thus, we can choose \( M \) small enough such that the right-hand side is negative.
\end{proof}
It is interesting that the contribution of external data of the same width as the prescribed
set is already enough to yield connectedness of the minimizer and even stickiness to the boundary.
Note also that, as \( R \) grows, the model approaches the problem considered
in~\cite{dipierro2020disconnectedness}.
\begin{proof}[Proof of \Cref{thm:102}]
We show that for \( M \) big enough, in particular for some \( M > 1 \), the minimizer is
disconnected. We slide a ball of radius \( \sqrt{M} \) in the \( e_1 \) direction and show that
this ball and the minimizer do not touch for \( M \) big enough. Assume that they touch at
some point \( q \). Then, since the minimizer is a viscosity solution of the Euler-Lagrange
equation, we have that
\begin{align*}
\int_{\mathbb{R}^n} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y} \leq 0. \tagged\label{eq:104}
\end{align*}
Now notice that we have
\begin{align*}
\int_{\mathbb{R}^n} \frac{\chi_{E_M^c} - \chi_{E_M}}{\lvert y - q \rvert^{n + s}} \dd{y}
\geq \int_{\mathbb{R}^n} \frac{\chi_{F_M^c} - \chi_{F_M}}{\lvert y - q \rvert^{n + s}} \dd{y},
\end{align*}
where \( F_M \coloneqq E_M \cup F_0 \) and \( F_0 \) is the external data of the model considered
in~\cite{dipierro2020disconnectedness}; the inequality holds since \( E_M \subset F_M \). Thus, we
are in the same setting as in~\cite{dipierro2020disconnectedness} and can conclude that for \( M \)
large enough the left-hand side of \cref{eq:104} is positive. This is a contradiction, hence the
minimizer is disconnected.
\end{proof}
Whereas \Cref{thm:102} is consistent with the classical theory of minimal surfaces, the behavior of
the minimizer in \Cref{thm:101} is unique to nonlocal minimal surfaces.
In~\cite{dipierro2020disconnectedness} the authors have shown that the minimizer exhibits behavior
similar to what we found in \Cref{thm:101} for the model considered in this chapter. It is
remarkable, however, that even in the case of very small external data, of the same width as the
prescribed set, the minimizer is connected and even sticks to the boundary for \( M \) small enough
relative to the size of \( R \). This suggests that the contribution of the external data \( E_0 \)
above and below is enough to push the minimizer to the boundary of the prescribed set \( \Omega \).