Power Series Convergence and Properties
Explores the convergence, radius of convergence, and properties of power series, including operations, continuity, integration, and differentiation.
Power Series
This chapter explores power series, which are fundamental in analysis for representing functions as infinite sums of terms involving increasing powers of a variable. We consider $ \mathbb{K} = \mathbb{R} $ or $ \mathbb{C} $.
12.1 Convergence
Definition 12.1
A power series with variable in $ \mathbb{K} $ is any series of functions $ \sum f_n $ associated with a sequence $ (a_n) \in \mathbb{K}^{\mathbb{N}} $, where for all $ n \in \mathbb{N} $, $ f_n : \mathbb{K} \longrightarrow \mathbb{K} $ is defined as $ z \longmapsto a_n z^n $. Such a series is denoted $ \sum a_n z^n $. When it converges, its sum is $ S : \mathbb{K} \longrightarrow \mathbb{K} $ such that $ z \longmapsto \sum_{n=0}^{+\infty} a_n z^n $.
Remark 12.2
- The partial sums of a power series are polynomial functions.
- The scalars $ a_n $ are called the coefficients of the power series.
We will first study the properties of the sum function $ S $ (domain of definition, continuity, differentiability, or primitivation when $ \mathbb{K} = \mathbb{R} $). Then, we will explore conditions for a function to be the sum of a power series.
Example 12.3
- For all $ z \in \mathbb{C} $, $ \exp(z) = \sum_{n=0}^{+\infty} \frac{1}{n!} z^n $.
- For all $ z \in \mathbb{C} $ such that $ |z| < 1 $, $ \frac{1}{1 - z} = \sum_{n=0}^{+\infty} z^n $.
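The two expansions of Example 12.3 can be checked numerically. The sketch below (helper name `partial_sum` is ours, not from the chapter) compares partial sums against `cmath.exp` and $1/(1-z)$ at a point with $|z| = 0.5 < 1$.

```python
import cmath
import math

def partial_sum(coeff, z, N):
    """Return the partial sum  sum_{n=0}^{N-1} coeff(n) * z**n."""
    return sum(coeff(n) * z**n for n in range(N))

z = 0.3 + 0.4j                                  # |z| = 0.5 < 1
s_exp = partial_sum(lambda n: 1 / math.factorial(n), z, 30)
s_geo = partial_sum(lambda n: 1.0, z, 200)

assert abs(s_exp - cmath.exp(z)) < 1e-12        # exp(z) = sum z^n / n!
assert abs(s_geo - 1 / (1 - z)) < 1e-12         # 1/(1-z) = sum z^n on |z| < 1
```

The exponential series converges everywhere, so 30 terms already suffice; the geometric series needs $|z| < 1$.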
12.1.1 Radius of Convergence
Lemma 12.4: Abel's Lemma
Let $ \sum a_n z^n $ be a power series. If there exists $ z_0 \in \mathbb{C} \setminus \{0\} $ such that $ (a_n z_0^n)_n $ is bounded, then for all $ z \in \mathbb{C} $ with $ |z| < |z_0| $, $ \sum a_n z^n $ converges absolutely (ACV).
Definition 12.5
Let $ \sum a_n z^n $ be a power series. The radius of convergence $ R $ of $ \sum a_n z^n $ is defined as the supremum in $ \overline{\mathbb{R}_+} $ of the set $ \{\rho \in \mathbb{R}_+ \mid (a_n \rho^n)_n \text{ is bounded}\} $. That is, $ R = \sup \{\rho \in \mathbb{R}_+ \mid (a_n \rho^n)_n \text{ is bounded}\} $. This definition ensures $ R $ is a real value or $ +\infty $. If instead we used $ E = \{z \in \mathbb{C} \mid (a_n z^n)_n \text{ is bounded}\} $, the supremum would not be defined, since $ E $ is a subset of $ \mathbb{C} $, which carries no order.
Theorem 12.7
Let $ \sum a_n z^n $ be a power series with radius of convergence $ R > 0 $. For any $ z \in \mathbb{C} $:
- If $ |z| < R $, then $ \sum a_n z^n $ converges absolutely (ACV).
- If $ |z| > R $, then $ \sum a_n z^n $ diverges grossly (GDV): its general term $ (a_n z^n) $ does not tend to 0.
Remark 12.8
- The set $ \mathcal{D}(0,R) = \{z\in \mathbb{K}\mid |z| < R\} $ is called the open disk of convergence. Thus, $ \mathcal{D}(0,R) \subset (\text{domain of convergence}) \subset \overline{\mathcal{D}(0,R)} $.
- The theorem provides no information for $ |z| = R $. The set $ \{z \in \mathbb{K} \mid |z| = R\} $ is called the circle of uncertainty, where anything can happen (convergence or divergence).
- If $ \mathbb{K} = \mathbb{R} $, then $ \mathcal{D}(0,R) = ] - R,R[ $ is the open interval of convergence, and the circle of uncertainty consists of $ \{\pm R\} $.
- If $ R = +\infty $, $ \sum a_n z^n $ converges for all $ z \in \mathbb{C} $.
- If $ R = 0 $, $ z = 0 $ is the only point for which $ \sum a_n z^n $ converges. The domain of convergence is $ \{0\} $.
Corollary 12.9
Let $ \sum a_n z^n $ be a power series. Its radius of convergence $ R $ satisfies:
- $ R = \sup \{\rho \in \mathbb{R}_+ \mid (a_n \rho^n)_n \text{ bounded}\} $ (by definition).
- $ R = \sup \{\rho \in \mathbb{R}_+ \mid a_n \rho^n \xrightarrow{n \to +\infty} 0\} $.
- $ R = \sup \{\rho \in \mathbb{R}_+ \mid \sum a_n \rho^n \text{ converges}\} $.
12.1.2 Practical Determination of the Radius of Convergence
Bounding Method
This method uses properties of the circle of uncertainty and the characteristics of the radius of convergence.
Let $ \sum a_n z^n $ be a power series with radius of convergence $ R $. Let $ z_1, z_2 \in \mathbb{K} $.
- If $ (a_n z_1^n) $ is bounded, or $ a_n z_1^n \xrightarrow{n \to +\infty} 0 $, or $ \sum a_n z_1^n $ converges, then $ R \geqslant |z_1| $.
- If $ (a_n z_2^n) $ is not bounded, or $ a_n z_2^n \not\xrightarrow{n \to +\infty} 0 $, or $ \sum a_n z_2^n $ diverges, then $ R \leqslant |z_2| $.
The goal is to find $ z_1 $ and $ z_2 $ such that $ |z_1| = |z_2| $ to conclude that $ R = |z_1| = |z_2| $.
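The bounding method can be illustrated on the (illustrative) series $ \sum \frac{z^n}{n} $: at $ z_1 = 1 $, $ a_n z_1^n = \frac{1}{n} \to 0 $, so $ R \geqslant 1 $; at $ z_2 = 1 $, $ \sum \frac{1}{n} $ diverges, so $ R \leqslant 1 $; hence $ R = 1 $. A numerical sketch of both facts (helper names are ours):

```python
import math

a = lambda n: 1 / n
# R >= 1: the general term at z1 = 1 tends to 0.
assert a(10**6) < 1e-5
# R <= 1: the series at z2 = 1 diverges; partial sums grow like ln N.
H = lambda N: sum(1 / n for n in range(1, N + 1))
assert H(10_000) > math.log(10_000)          # harmonic sums are unbounded
assert H(100_000) - H(10_000) > 2.0          # ln(10) ~ 2.30: no convergence
```

Here $ |z_1| = |z_2| = 1 $, which is exactly the situation the method aims for.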
Comparison Method
Let $ \sum a_n z^n $ and $ \sum b_n z^n $ be two power series with radii of convergence $ R_a $ and $ R_b $ respectively.
- If, from a certain rank, $ |a_n| \leqslant |b_n| $, then $ R_a \geqslant R_b $. The same holds if $ a_n = O(b_n) $ or $ a_n = o(b_n) $.
- If $ |a_n| \sim |b_n| $, then $ R_a = R_b $. In particular, if $ a_n \sim b_n $, then $ R_a = R_b $.
Using D'Alembert's Ratio Test for Numerical Series
Proposition 12.12
Let $ \sum a_n z^n $ be a power series with radius of convergence $ R $. Suppose that for all $ n \in \mathbb{N} $, $ a_n \neq 0 $, and that $ \left(\frac{|a_{n+1}|}{|a_n|}\right)_n $ admits a limit $ \ell \in \overline{\mathbb{R}_+} $.
Then, $ R = \left\{ \begin{array}{ll} + \infty & \mathrm{if}\ \ell = 0\\ \frac{1}{\ell} & \mathrm{if}\ \ell \in ]0, + \infty [ \\ 0 & \mathrm{if}\ \ell = +\infty. \end{array} \right. $ With the conventions $ \frac{1}{0} = +\infty $ and $ \frac{1}{+\infty} = 0 $, we can write $ R = \frac{1}{\ell} $ in all cases.
This result does not directly apply to lacunary series (series with infinitely many zero coefficients, such as $ \sum a_n z^{2n} $), but can be adapted by modifying the proof.
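A sketch of Proposition 12.12 in action, for the illustrative coefficients $ a_n = \frac{n^2}{3^n} $: the ratio $ \frac{|a_{n+1}|}{|a_n|} = \left(\frac{n+1}{n}\right)^2 \cdot \frac{1}{3} \to \frac{1}{3} $, so $ R = 3 $.

```python
# Illustrative coefficients a_n = n^2 / 3^n; the ratio test gives R = 3.
def a(n):
    return n**2 / 3**n

ratios = [a(n + 1) / a(n) for n in range(1, 400)]
ell = ratios[-1]                 # close to the limit 1/3

assert abs(ell - 1 / 3) < 2e-3
R = 1 / ell                      # estimated radius of convergence
assert abs(R - 3) < 2e-2
```

The convergence of the ratios is slow (error of order $ \frac{1}{n} $), so this is only a numerical estimate of the exact value $ R = 3 $.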
Proposition 12.14
For any $ \alpha \in \mathbb{R} $, the radius of convergence of the series $ \sum n^{\alpha}z^{n} $ is 1.
12.1.3 Operations on Power Series
Let $ \sum a_n z^n $ and $ \sum b_n z^n $ be two power series with radii of convergence $ R_a $ and $ R_b $, and respective sums $ S_a $ and $ S_b $. Let $ \lambda \in \mathbb{K} $.
Operations on numerical series define three power series:
- Sum: $ \sum a_n z^n + \sum b_n z^n = \sum (a_n + b_n)z^n $, with radius of convergence $ R_{a+b} $ and sum $ S_{a+b} $.
- Scalar Multiplication: $ \lambda \cdot \sum a_n z^n = \sum \lambda a_n z^n $, with radius of convergence $ R_{\lambda a} $ and sum $ S_{\lambda a} $.
- Cauchy Product: $ \left(\sum a_n z^n\right)\left(\sum b_n z^n\right) = \sum c_n z^n $ where $ c_n = \sum_{k=0}^{n} a_k b_{n-k} $, with radius of convergence $ R_{ab} $ and sum $ S_{ab} $.
Theorem 12.15
$ R_{\lambda a} = \left\{ \begin{array}{ll} R_a & \text{if } \lambda \neq 0 \\ +\infty & \text{if } \lambda = 0, \end{array} \right. \qquad R_{a+b}\left\{ \begin{array}{ll} = \min (R_a,R_b) & \text{if } R_a\neq R_b\\ \geqslant R_a = R_b & \text{if } R_a = R_b, \end{array} \right. $ and in all cases, $ R_{a+b}\geqslant \min (R_a,R_b) $ and $ R_{ab} \geqslant \min(R_a, R_b) $.
Theorem 12.16
- If $ |z| < R_a $, then $ S_{\lambda a}(z) = \lambda S_a(z) $.
- If $ |z| < \min(R_a, R_b) $, then $ S_{a+b}(z) = S_a(z) + S_b(z) $.
- If $ |z| < \min(R_a, R_b) $, then $ S_{ab}(z) = S_a(z)S_b(z) $.
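The Cauchy product can be sketched numerically. With $ a_n = b_n = 1 $ (geometric series, illustrative choice), $ c_n = \sum_{k=0}^{n} 1 = n + 1 $, and the product sum is $ \frac{1}{(1-z)^2} $ on $ |z| < 1 $. Helper name `cauchy_product` is ours:

```python
def cauchy_product(a, b, N):
    """First N Cauchy-product coefficients c_n = sum_{k<=n} a[k] * b[n-k]."""
    return [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(N)]

N = 50
a = [1.0] * N                                  # geometric coefficients
c = cauchy_product(a, a, N)
assert c[:5] == [1.0, 2.0, 3.0, 4.0, 5.0]      # c_n = n + 1

z = 0.4
S_ab = sum(c[n] * z**n for n in range(N))
assert abs(S_ab - 1 / (1 - z)**2) < 1e-10      # S_ab = S_a * S_b on |z| < 1
```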
12.2 Power Series of a Real Variable
Let $ \sum a_n t^n $ be a power series of a real variable $ t $ with radius of convergence $ R > 0 $. The open interval of convergence is $ ]-R, R[ $, and the circle of uncertainty is $ \{\pm R\} $.
In this section, we study $ S : ] - R, R[ \longrightarrow \mathbb{K} $, $ t \longmapsto \sum_{n=0}^{+\infty} a_n t^n $, which may also be defined at $ \pm R $. For example (each series has radius of convergence 1):
- $ \sum \frac{t^n}{n} $: sum defined on $ [-1, 1[ $;
- $ \sum \frac{t^n}{n^2} $: sum defined on $ [-1, 1] $;
- $ \sum \frac{t^{2n}}{n} $: sum defined on $ ]-1, 1[ $.
Theorem 12.18
A power series of a real variable converges normally on any compact segment included in its open interval of convergence.
This theorem relates to the concept of normal convergence for series of functions (Chapter 11).
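Theorem 12.18 can be sketched for the geometric series on a segment $ [-r, r] $ with $ r = 0.9 < R = 1 $ (illustrative choice): the sup norm of $ t \longmapsto t^n $ on $ [-r, r] $ is $ r^n $, and $ \sum r^n $ converges (to $ \frac{1}{1-r} $), which is exactly normal convergence on the segment.

```python
# Normal convergence of sum t^n on [-0.9, 0.9]: the sup norms are r^n,
# and their sum converges to 1/(1-r).
r = 0.9
sup_norms = [r**n for n in range(2000)]       # ||f_n||_inf on [-r, r]
total = sum(sup_norms)
assert abs(total - 1 / (1 - r)) < 1e-8
```

On the full open interval $ ]-1, 1[ $, the sup norms are all 1, so convergence is not normal there: CVNSTS is genuinely weaker than CVN.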
11.1.3 Normal Convergence for a Series of Functions
Definition 11.12
Assume the functions $ f_n $ are bounded on $ I $ (at least from a certain rank). We say that $ \sum f_n $ converges normally (CVN) on $ I $ when the numerical series $ \sum \| f_n\|_{\infty} $ converges. We say that $ \sum f_n $ converges normally on every compact segment of $ I $ (CVNSTS) when, for every compact segment $ [a,b]\subset I $, $ \sum f_n $ converges normally on $ [a,b] $.
Theorem 11.14
- $ \sum f_n $ converges normally (CVN) on $ I \Longrightarrow \sum f_n $ converges uniformly (CVU) on $ I \Longrightarrow \sum f_n $ converges simply (CVS) on $ I $.
- $ \sum f_n $ converges normally on every compact segment (CVNSTS) of $ I \Longrightarrow \sum f_n $ converges uniformly on every compact segment (CVUSTS) of $ I \Longrightarrow \sum f_n $ converges simply (CVS) on $ I $.
The converses are false.
Remark 11.15
- Evidently: $ \sum f_n $ CVN on $ I \Longrightarrow \sum f_n $ CVNSTS of $ I \Longrightarrow \forall x\in I,\sum f_n(x)$ converges absolutely (CVA).
- In all convergence modes, the sum of the series $ \sum_{n=0}^{+\infty} f_n $ can be defined. It is the same function for all types of convergence.
- CVNSTS only introduces novelty when $ I $ is not a compact segment.
12.2.1 Continuity
Theorem 12.19
The sum $ S $ of a power series is continuous on $ ]-R, R[ $.
Studying convergence at the endpoints of the interval (at $ R $ and $ -R $) is not a main objective of the curriculum. Nevertheless, such a study might be requested because all the necessary tools for this conclusion have been covered with the theorem of continuity of a series sum. All functions $ t \longmapsto a_n t^n $ are continuous on $ \mathbb{R} $, and thus at $ R $ and $ -R $. Uniform convergence on $ [0, R] $ is sufficient to conclude on the continuity of $ S $ at $ R $. Similarly, uniform convergence on $ [-R, 0] $ allows concluding on continuity at $ -R $.
This theorem relies on the continuity of the sum of a series of functions (Theorem 11.18).
11.2 Limit and Continuity
Theorem 11.18: Continuity of the Sum of a Series of Functions
If:
- For all $ n \in \mathbb{N} $, $ f_n $ is continuous on $ I $;
- $ \sum f_n $ converges uniformly on every compact segment (CVUSTS) of $ I $.
Then $ \sum_{n=0}^{+\infty} f_n $ is continuous on $ I $.
Example 12.20
The sum of the series $ \sum \frac{(-1)^{n+1}}{n} x^n $ is defined and continuous on $ ]-1, 1] $.
12.2.2 Integration (Primitivation)
$ S $ is continuous on $ ] - R, R[ $, so it admits primitives on this interval.
Theorem 12.21
The series $ \sum \frac{a_n}{n + 1} t^{n + 1} $ also has $ R $ as its radius of convergence. A primitive of $ S $ on $ ] - R, R[ $ is obtained by integrating term by term. In particular:
$ \forall x \in ] - R, R [, \int_ {0} ^ {x} S (t) \mathrm{d} t = \sum_ {n = 0} ^ {+ \infty} \frac {a _ {n}}{n + 1} x ^ {n + 1} $ (primitive series that is zero at 0).
Remark 12.22
- The radius of convergence is invariant under term-by-term integration.
- This is a term-by-term integration of a series that converges normally on every compact segment (CVNSTS), which enables the interchange of integral and sum (Theorem 11.25).
11.3 Integration on a compact segment
Theorem 11.25: Term-by-Term Integration Theorem on a compact segment
If:
- For all $ n \in \mathbb{N} $, $ f_n \in \mathcal{C}([a,b],\mathbb{K}) $;
- $ \sum f_n $ converges uniformly (CVU) on $ [a,b] $.
Then:
- $ \sum \int_{a}^{b}f_{n} $ converges (CV);
- $ \int_{a}^{b}\sum_{n = 0}^{+\infty}f_n(t)\,\mathrm{d}t = \sum_{n = 0}^{+\infty}\int_{a}^{b}f_{n}(t)\,\mathrm{d}t $.
These theorems apply ONLY on a compact segment.
Example 12.23
The radius of convergence of $ \sum (-1)^n t^n $ is 1. By primitivation, for all $ x\in ] - 1,1[ $, $ \ln (1 + x) = \overbrace{\sum_{n = 1}^{+\infty}\frac{(-1)^{n + 1}}{n} x^n}^{\widetilde{S}(x)} $. This allows calculating $ \sum_{n = 1}^{+\infty}\frac{(-1)^{n + 1}}{n} = \ln 2 $ (using the continuity of $ \widetilde{S} $ at 1, cf. Example 12.20).
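A numerical sketch of Example 12.23 (helper name `ln1p_series` is ours): the primitive series equals $ \ln(1+x) $ on $ ]-1,1[ $, and at $ x = 1 $ the slowly converging alternating series gives $ \ln 2 $.

```python
import math

def ln1p_series(x, N):
    """Partial sum  sum_{n=1}^{N} (-1)^{n+1} x^n / n."""
    return sum((-1)**(n + 1) * x**n / n for n in range(1, N + 1))

assert abs(ln1p_series(0.5, 60) - math.log(1.5)) < 1e-12
# Alternating series bound: the error at x = 1 is below the first
# omitted term, here 1/1001.
assert abs(ln1p_series(1.0, 1000) - math.log(2)) < 1e-3
```

The contrast in speed (60 terms suffice at $ x = 0.5 $, while 1000 terms give only three digits at $ x = 1 $) reflects that $ x = 1 $ sits on the circle of uncertainty.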
12.2.3 Differentiation
Theorem 12.24
For any $ k \in \mathbb{N} $, the series $ \sum \frac{n!}{(n - k)!} a_n t^{n - k} $ also has $ R $ as its radius of convergence. $ S $ is of class $ \mathcal{C}^{\infty} $ on $ ] - R, R[ $, and for all $ k \geqslant 1 $, $ S^{(k)} $ is obtained by term-by-term differentiation:
$ \forall k \in \mathbb {N}, \forall t \in ] - R, R [, S ^ {(k)} (t) = \sum_ {n = k} ^ {+ \infty} \frac {n !}{(n - k) !} a _ {n} t ^ {n - k} $ (kth derivative series).
In particular: $ \forall k\in \mathbb{N}, a_{k} = \frac{S^{(k)}(0)}{k!} $.
This theorem relies on the term-by-term differentiation theorem (Theorem 11.30).
11.4 Differentiation
Theorem 11.30: Term-by-Term Differentiation Theorem
If:
- For all $ n \in \mathbb{N} $, $ f_n \in \mathcal{C}^1 (I,\mathbb{K}) $;
- $ \sum f_n $ converges simply (CVS) on $ I $;
- $ \sum f_n^{\prime} $ converges uniformly on every compact segment (CVUSTS) of $ I $.
Then $ \sum_{n=0}^{+\infty} f_n $ is of class $ \mathcal{C}^1 $ on $ I $ and $ \left(\sum_{n=0}^{+\infty} f_n\right)' = \sum_{n=0}^{+\infty} f_n' $.
Remark 12.25
- The radius of convergence is invariant under term-by-term differentiation.
- In particular: $ \sum a_n z^n $ and $ \sum n a_n z^n $ have the same radius of convergence.
- The particular case $ k = 1 $ gives $ \forall t \in ] - R, R [, S'(t) = \sum_{n=1}^{+\infty} n a_n t^{n-1} $ (with the same radius of convergence $ R $).
Example 12.26
$ \forall x \in ] - 1, 1 [, \frac{1}{1 + x} = \sum_{n = 0}^{+\infty} (-1)^n x^n $. By differentiation, $ \forall x \in ] - 1, 1 [, \frac{-1}{(1 + x)^2} = \sum_{n = 1}^{+\infty} n(-1)^n x^{n - 1} $.
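The differentiated expansion can be checked numerically at a few illustrative points (helper name `dgeom_series` is ours): $ \sum_{n \geqslant 1} n(-1)^n x^{n-1} $ should equal $ \frac{-1}{(1+x)^2} $ for $ |x| < 1 $.

```python
def dgeom_series(x, N):
    """Partial sum  sum_{n=1}^{N} n (-1)^n x^(n-1)."""
    return sum(n * (-1)**n * x**(n - 1) for n in range(1, N + 1))

for x in (-0.5, 0.0, 0.3, 0.7):
    assert abs(dgeom_series(x, 400) - (-1 / (1 + x)**2)) < 1e-10
```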
12.3 Power Series Expansions (Real Variable)
Definition 12.27
Let $ f $ be defined on a neighborhood $ ] - r, r[ $ of 0 where $ r > 0 $. We say that $ f $ is developable in power series (DSE(0)) on $ ] - r, r[ $ if there exists a power series $ \sum a_n x^n $ with a radius of convergence $ R > 0 $ such that $ R \geqslant r $ and $ \forall x \in ] - r, r[ $, $ f(x) = \sum_{n=0}^{+\infty} a_n x^n $.
12.3.1 Existence and Properties of a DSE(0)
Let $ f $ be defined on a neighborhood $ ] - r, r[ $ of 0 where $ r > 0 $.
Definition 12.28
If $ f $ is of class $ \mathcal{C}^{\infty} $ in the neighborhood of 0, then the power series $ \sum \frac{f^{(n)}(0)}{n!} x^n $ is called the Taylor series of $ f $ at 0.
Proposition 12.29
If $ f $ is developable in power series (DSE(0)) on $ ] - r, r[ $, then $ f \in \mathcal{C}^{\infty}(] - r, r[) $ and its DSE(0) is the Taylor series of $ f $ at 0.
Remark 12.30
- In particular, if it exists, the DSE(0) of a function is unique.
- Functions with a DSE(0) must therefore be sought among functions of class $ \mathcal{C}^{\infty} $ in the neighborhood of 0.
- There exist functions of class $ \mathcal{C}^{\infty} $ in the neighborhood of 0 that are not DSE(0).
Example 12.31
The function $ f : \mathbb{R} \longrightarrow \mathbb{R} $, given by $ x \longmapsto \begin{cases} \mathrm{e}^{-\frac{1}{x^2}} & \text{if } x \neq 0 \\ 0 & \text{if } x = 0 \end{cases} $ is of class $ \mathcal{C}^{\infty} $ on $ \mathbb{R} $, hence in the neighborhood of 0, but is not DSE(0).
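A numerical illustration of Example 12.31: $ f(x) = \mathrm{e}^{-1/x^2} $ is "flat" at 0, i.e. $ \frac{f(x)}{x^k} \to 0 $ for every $ k $, so every Taylor coefficient of $ f $ at 0 vanishes; its Taylor series is the zero series, yet $ f $ is not the zero function, so $ f $ is not DSE(0).

```python
import math

def f(x):
    return math.exp(-1 / x**2) if x != 0 else 0.0

x = 0.05
for k in (1, 5, 10, 20):
    assert f(x) / x**k < 1e-50        # f crushes every power of x near 0
assert f(1.0) > 0                     # yet f is not identically zero
```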
12.3.2 Common Expansions
In the following expansions, $ R $ always denotes the radius of convergence.
Geometric Series and Consequences
- $ \sum z^n $ has a radius of convergence $ R=1 $. As a geometric series: $ \forall x \in ] - 1, 1[ $, $ \frac{1}{1 - x} = \sum_{n=0}^{+\infty} x^n $ $ (R = 1) $.
- For any $ x \in ] - 1, 1[ $, $ -x \in ] - 1, 1[ $, so $ \forall x \in ] - 1, 1[ $, $ \frac{1}{1 + x} = \sum_{n=0}^{+\infty} (-1)^n x^n $ $ (R = 1) $.
- By integrating the previous DSE(0): $ \forall x \in ] - 1, 1[ $, $ \ln(1 + x) = \sum_{n=0}^{+\infty} \frac{(-1)^n}{n + 1} x^{n+1} $ $ (R = 1) $.
- Using the DSE(0) of $ x \longmapsto \frac{1}{1 + x^2} $, we get: $ \forall x \in ] - 1, 1[ $, $ \arctan(x) = \sum_{n=0}^{+\infty} \frac{(-1)^n}{2n + 1} x^{2n+1} $ $ (R = 1) $.
Exponential and Trigonometric Functions
In Chapter 2, we defined $ \exp(Z) = \sum_{n=0}^{+\infty} \frac{Z^n}{n!} $ for all $ Z \in \mathbb{C} $. Let $ z \in \mathbb{C} $ and $ f_z: \mathbb{R} \longrightarrow \mathbb{C} $ be $ x \longmapsto \exp(zx) $.
Then, $ \forall x \in \mathbb{R} $, $ f_z(x) = \sum_{n=0}^{+\infty} \frac{z^n}{n!} x^n $ (*).
For $ z = 1 $: $ \forall x \in \mathbb{R} $, $ f_1(x) = \sum_{n=0}^{+\infty} \frac{x^n}{n!} $ $ (R = +\infty) $. Differentiating this DSE(0) term by term (conserving the radius of convergence), we get $ f_1'(x) = f_1(x) $ for all $ x \in \mathbb{R} $. Since $ f_1(0) = 1 $, $ f_1 $ is the real exponential function.
Consequently, $ \boxed{\forall x \in \mathbb{R}, \mathrm{e}^x = \sum_{n=0}^{+\infty} \frac{x^n}{n!} \ (R = +\infty).} $
By linear combinations, since $ \mathrm{e}^{-x} = \sum_{n=0}^{+\infty} \frac{(-1)^n x^n}{n!} $ for all $ x \in \mathbb{R} $:
$ \boxed{\forall x \in \mathbb{R}, \operatorname{ch}(x) = \sum_{n=0}^{+\infty} \frac{x^{2n}}{(2n)!} \ (R = +\infty)} \qquad ; \qquad \boxed{\forall x \in \mathbb{R}, \operatorname{sh}(x) = \sum_{n=0}^{+\infty} \frac{x^{2n+1}}{(2n+1)!} \ (R = +\infty).} $
With $ z = \pm i $ in (*) according to Euler's formulas:
$ \boxed{\forall x \in \mathbb{R}, \cos(x) = \sum_{n=0}^{+\infty} \frac{(-1)^n x^{2n}}{(2n)!} \ (R = +\infty) \text{ and } \sin(x) = \sum_{n=0}^{+\infty} \frac{(-1)^n x^{2n+1}}{(2n+1)!} \ (R = +\infty).} $
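The cosine and sine expansions above can be checked numerically at a few illustrative points (helper names are ours); since $ R = +\infty $, the same truncation works for every real $ x $, just with more terms needed as $ |x| $ grows.

```python
import math

def cos_series(x, N):
    return sum((-1)**n * x**(2 * n) / math.factorial(2 * n) for n in range(N))

def sin_series(x, N):
    return sum((-1)**n * x**(2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(N))

for x in (-3.0, 0.5, 2.0):
    assert abs(cos_series(x, 25) - math.cos(x)) < 1e-12
    assert abs(sin_series(x, 25) - math.sin(x)) < 1e-12
```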
Binomial Series $ x \longmapsto (1 + x)^\alpha $ where $ \alpha \in \mathbb{R} $
- If $ \alpha \in \mathbb{N} $, then, by the binomial formula, $ \forall x \in \mathbb{R} $, $ (1 + x)^\alpha = \sum_{n=0}^{\alpha} \binom{\alpha}{n} x^n $ $ (R = +\infty) $.
- If $ \alpha \notin \mathbb{N} $, then $ \boxed{\forall x \in ]-1,1[, \ (1 + x)^\alpha = 1 + \sum_{n=1}^{+\infty} \frac{\alpha(\alpha - 1) \cdots (\alpha - n + 1)}{n!} x^n \ (R = 1).} $ This DSE(0) can be found by solving the differential equation $ (1 + x)y' - \alpha y = 0 $, of which $ x \longmapsto (1 + x)^\alpha $ is the unique solution equal to 1 at 0.
Particular Cases
- $ \alpha = -1 $: already seen: $ \frac{1}{1+x} = \sum_{n=0}^{+\infty} (-1)^n x^n $ $ (R = 1) $.
- $ \alpha = -\frac{1}{2} $: $ \forall x \in ]-1,1[, \ \frac{1}{\sqrt{1 + x}} = \overbrace{1}^{b_0} + \sum_{n=1}^{+\infty} \overbrace{\frac{1 \cdot 3 \cdots (2n - 1)}{2^n n!}}^{b_n} (-1)^n x^n. $
Also, $ \forall n \in \mathbb{N} $, $ b_n = \frac{(2n)!}{2^{2n}(n!)^2} $. So $ \forall x \in ]-1,1[, \ \frac{1}{\sqrt{1 + x}} = \sum_{n=0}^{+\infty} \frac{(2n)!}{2^{2n}(n!)^2} (-1)^n x^n \ (R = 1). $
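The closed form $ b_n = \frac{(2n)!}{2^{2n}(n!)^2} $ and the resulting expansion of $ \frac{1}{\sqrt{1+x}} $ can be checked numerically (helper names are ours):

```python
import math

def b(n):
    """b_n = (2n)! / (2^(2n) * (n!)^2)."""
    return math.factorial(2 * n) / (2**(2 * n) * math.factorial(n)**2)

def inv_sqrt_series(x, N):
    """Partial sum  sum_{n=0}^{N-1} b_n (-1)^n x^n."""
    return sum(b(n) * (-1)**n * x**n for n in range(N))

assert b(0) == 1.0 and b(1) == 0.5            # b_n starts 1, 1/2, 3/8, ...
for x in (-0.5, 0.25, 0.5):
    assert abs(inv_sqrt_series(x, 400) - 1 / math.sqrt(1 + x)) < 1e-9
```

Note that $ b_n \sim \frac{1}{\sqrt{\pi n}} $ does not tend to 0 fast enough for convergence outside $ [-1, 1] $, consistent with $ R = 1 $.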
Chapter 11: Sequences and Series of Functions (Reference Principles)
This chapter provides the foundational concepts used for power series convergence and properties. Throughout, $ I $ is an interval and $ f_n : I \longrightarrow \mathbb{K} $.
11.1 Convergences
11.1.1 Simple Convergence (Pointwise)
Definition 11.1: A sequence of functions $ (f_n) $ converges simply (CVS) to $ f $ on $ I $ if for every $ x \in I $, the numerical sequence $ (f_n(x)) $ converges to $ f(x) $.
Definition 11.2: A series of functions $ \sum f_n $ converges simply (CVS) on $ I $ if for every $ x \in I $, the numerical series $ \sum f_n(x) $ converges. Its sum is $ S : x \longmapsto \sum_{n=0}^{+\infty} f_n(x) $.
11.1.2 Uniform Convergence (CVU)
Definition 11.3: A sequence of functions $ (f_n) $ converges uniformly (CVU) to $ f $ on $ I $ if $ \forall \varepsilon > 0, \exists N \in \mathbb{N}, \forall n \geqslant N, \forall x \in I, |f_n(x) - f(x)| \leqslant \varepsilon $ ($ N $ depends on $ \varepsilon $ only, not on $ x $).
CVUSTS: $ (f_n) $ converges uniformly on every compact segment $ [a,b] \subset I $.
Theorem 11.5: Hierarchy of Convergences
CVU on $ I \Longrightarrow $ CVUSTS of $ I \Longrightarrow $ CVS on $ I $.
The limit function is the same for all types of convergence.
The converses are generally false.
Proposition 11.6: Characterization of CVU
$ (f_n) $ converges uniformly to $ f $ on $ I \Longleftrightarrow \| f_n - f \|_{\infty} \xrightarrow{n \to +\infty} 0 $, where $ \| g \|_{\infty} = \sup_{x \in I} |g(x)| $.
Definition 11.9: CVU for Series
A series of functions $ \sum f_n $ converges uniformly (CVU) on $ I $ if its sequence of partial sums converges uniformly on $ I $.
Proposition 11.10: Characterization of CVU for Series
$ \sum f_n $ CVU on $ I \Longleftrightarrow \sum f_n $ CVS on $ I $ and the sequence of remainders $ \left( R_n : x \longmapsto \sum_{k=n+1}^{+\infty} f_k(x) \right)_n $ converges uniformly to 0 on $ I $.
11.1.3 Normal Convergence (CVN) for a Series of Functions
Definition 11.12: $ \sum f_n $ converges normally (CVN) on $ I $ if the numerical series $ \sum \| f_n \|_{\infty} $ converges (assuming the $ f_n $ are bounded on $ I $).
CVNSTS: $ \sum f_n $ converges normally on every compact segment of $ I $.
Theorem 11.14: Hierarchy with CVN
CVN on $ I \Longrightarrow $ CVU on $ I \Longrightarrow $ CVS on $ I $; likewise, CVNSTS of $ I \Longrightarrow $ CVUSTS of $ I \Longrightarrow $ CVS on $ I $.
Remark 11.15: $ \sum f_n $ CVN on $ I \Longrightarrow $ for all $ x \in I $, $ \sum f_n(x) $ converges absolutely (CVA).
11.2 Limit and Continuity
Theorem 11.18: Continuity of the Sum of a Series of Functions
If each $ f_n $ is continuous on $ I $ and $ \sum f_n $ converges uniformly on every compact segment of $ I $, then the sum $ \sum_{n=0}^{+\infty} f_n $ is continuous on $ I $.
Theorem 11.21: Double Limit Permutation
If each $ f_n $ has a limit $ \ell_n $ at a point $ a $ adherent to $ I $, and $ \sum f_n $ converges uniformly on $ I $, then $ \sum \ell_n $ converges and $ \lim_{x \to a} \sum_{n=0}^{+\infty} f_n(x) = \sum_{n=0}^{+\infty} \ell_n $.
CVU (not just CVUSTS) is essential for the double limit permutation.
11.3 Integration on a Segment
Here, $ I $ is a compact segment $ [a,b] $.
Theorem 11.25: Term-by-Term Integration on a Segment
If each $ f_n \in \mathcal{C}([a,b],\mathbb{K}) $ and $ \sum f_n $ converges uniformly on $ [a,b] $, then $ \sum \int_{a}^{b} f_n $ converges and $ \int_{a}^{b} \sum_{n=0}^{+\infty} f_n(t)\,\mathrm{d}t = \sum_{n=0}^{+\infty} \int_{a}^{b} f_n(t)\,\mathrm{d}t $. (The analogous statement for sequences is Theorem 11.24: if $ (f_n) $ converges uniformly to $ f $ on $ [a,b] $, then $ \int_{a}^{b} f_n \xrightarrow{n \to +\infty} \int_{a}^{b} f $.)
This applies ONLY on a compact segment.
11.4 Differentiation
Theorem 11.30: Term-by-Term Differentiation for Series
If each $ f_n \in \mathcal{C}^1(I,\mathbb{K}) $, $ \sum f_n $ converges simply on $ I $, and $ \sum f_n' $ converges uniformly on every compact segment of $ I $, then $ \sum_{n=0}^{+\infty} f_n $ is of class $ \mathcal{C}^1 $ on $ I $ and $ \left( \sum_{n=0}^{+\infty} f_n \right)' = \sum_{n=0}^{+\infty} f_n' $.
Theorem 11.31: Extension to $ \mathcal{C}^k $ Functions
If each $ f_n \in \mathcal{C}^k(I,\mathbb{K}) $, $ \sum f_n^{(j)} $ converges simply on $ I $ for every $ j \in \{0,\ldots,k-1\} $, and $ \sum f_n^{(k)} $ converges uniformly on every compact segment of $ I $, then the sum is of class $ \mathcal{C}^k $ on $ I $ and derivatives can be taken term by term up to order $ k $.
Chapter 12: Power Series
This chapter introduces power series, focusing on their convergence, properties, and manipulation.
12.1 Convergence
Definition 12.1: Power Series
A power series is a function series ∑ where .
It is denoted ∑ .
When it converges, its sum is .
Remark 12.2:
Partial sums are polynomial functions.
are the coefficients of the series.
Examples 12.3:
∀ , .
∀ , .
12.1.1 Radius of Convergence
Lemma 12.4: Abel's Lemma
If there exists such that is bounded, then for any with , the series ∑ is absolutely convergent (ACV).
Definition 12.5: Radius of Convergence (R)
.
This defines the boundary of convergence.
Theorem 12.7: Convergence Criteria for Power Series
If : The series ∑ is Absolutely Convergent (ACV).
If : The series ∑ is Grossly Divergent (GDV).
Remark 12.8: Key Terminology
Disk of Convergence: . This is the open disk where the series converges.
Circle of Uncertainty: The theorem states nothing for . Anything can happen on this circle.
For , (open interval of convergence).
If : Series converges for all .
If : Series converges only for .
Corrolary 12.9: Alternative Expressions for R
(Definition)
12.1.2 Practical Determination of R
Methods:
Bracketing (Encadrement):
If is bounded OR OR CV: then .
If is not bounded OR does not tend to 0 OR DV: then .
Goal: Find such that to conclude .
Comparisons: Let ∑ (RDC ) and ∑ (RDC ).
If (eventually), then . (Also true for or ).
If , then .
d'Alembert's Rule for Numerical Series (Ratio Test):
Proposition 12.12: If and has a limit :
If .
If .
If .
Caution: Not applicable for lacunary series (many are zero).
Proposition 12.14: For any , the radius of convergence of is 1.
12.1.3 Operations on Power Series
Let ∑ (RDC ) and ∑ (RDC ).
Scalar Multiplication: .
Theorem 12.15 (1): if , if .
Theorem 12.16 (1): If , .
Addition: .
Theorem 12.15 (2): . If , then .
Theorem 12.16 (2): If , .
Cauchy Product: .
Theorem 12.15 (3): .
Theorem 12.16 (3): If , .
12.2 Power Series of a Real Variable
Let ∑ be a power series with real variable and RDC . The open interval of convergence is . Functions are studied for .
Theorem 12.18: Normal Convergence on Compact Intervals
A power series converges normally on any segment (compact interval) within its open interval of convergence.
12.2.1 Continuity
Theorem 12.19: Continuity of the Sum of a Power Series
The sum of a power series is continuous on .
Border points (): Continuity at these points might be established if the series converges uniformly on the relevant closed interval (e.g., ).
12.2.2 Primitive (Integration)
Theorem 12.21: Term-by-Term Integration
The series (a primitive of the original series) has the same radius of convergence, .
A primitive of on is obtained by integrating term-by-term:
.
The radius of convergence is invariant under integration.
12.2.3 Derivation
Theorem 12.24: Term-by-Term Differentiation
For any , the series (the -th derivative series) has the same radius of convergence, .
is .
The -th derivative is obtained by differentiating term-by-term:
.
An important consequence: .
The radius of convergence is invariant under differentiation.
12.3 Power Series Expansions (Real Variable)
Definition 12.27: Developable in Power Series (DSE(0))
A function is developable in power series on if there exists a power series ∑ with such that for all .
12.3.1 Existence and Properties of DSE(0)
Definition 12.28: Taylor Series at 0
If near 0, its Taylor series at 0 is .
Proposition 12.29: Uniqueness of DSE(0)
If is DSE(0) on , then and its DSE(0) is unique, given by its Taylor series at 0.
Warning: Being does not guarantee DSE(0) (e.g., at ).
12.3.2 Usual Power Series Expansions
Here, denotes the radius of convergence.
Geometric Series: , ()
, ()
, () (by integration)
, () (from )
Exponential and Trigonometric Functions: ( for all)
Binomial Series:
If : , ()
If : , ()
Special case: , ()
Chapter 11: Sequences and Series of Functions (Recap for context)
This chapter lays the groundwork for understanding convergence concepts crucial for power series.
>
11.1 Convergences
11.1.1 Simple Convergence (CVS)
Sequence CVS: For each , converges to a limit .
Series ∑ CVS: For each , the numerical series ∑ converges to a sum .
11.1.2 Uniform Convergence (CVU)
Sequence CVU on : depends only on , not on . for all .
CVUSTS (Uniform Convergence on Every Compact Subset): Applies to any compact .
Relation: CVU CVUSTS CVS. (Reverses are generally false).
Characterization (Proposition 11.6): CVU to .
Series ∑ CVU: The sequence of partial sums converges uniformly.
Characterization (Proposition 11.10): ∑ CVU ∑ CVS AND the sequence of remainders converges uniformly to 0.
11.1.3 Normal Convergence (CVN) for a Series of Functions
Definition 11.12: ∑ CVN on if converges (where ).
CVNSTS (Normal Convergence on Every Compact Subset): Applies to any compact .
Relation: CVN CVU CVS. (Reverses are generally false).
Important: ∑ CVN on ∑ is Absolutely Convergent (CVA) for all .
11.2 Limit and Continuity
Theorem 11.17 (Continuity of Limit of Sequence) & Theorem 11.18 (Continuity of Sum of Series)
If are continuous and converge CVUSTS to , then is continuous.
If are continuous and ∑ converges CVUSTS, then its sum function is continuous.
Theorem 11.21: Double Limit Theorem
If has limit at , AND ∑ CVU on , then:
∑ converges.
.
Key: Requires CVU (not just CVUSTS) for the series.
11.3 Integration on a Segment
These theorems apply only on compact intervals .
Theorem 11.24: Interchange of Limit and Integral
If is a sequence of continuous functions, and converges CVU on to , then .
Theorem 11.25: Term-by-Term Integration of a Series
If is a sequence of continuous functions, and ∑ converges CVU on , then:
∑ converges.
.
11.4 Derivation
Theorem 11.27: Differentiability of Limit of Sequence
If , CVS to , and CVUSTS to , then is and .
Theorem 11.30: Term-by-Term Differentiation of a Series
If , ∑ CVS on , and ∑ CVUSTS on , then:
∑ is on .
.
Extension to (Theorem 11.31): If all derivatives up to converge simply, and the -th derivative series converges CVUSTS, then the sum is and derivatives can be taken term-by-term up to order .
Chapter 12: Power Series
This chapter introduces power series, their convergence properties, and operations, with a focus on real variable analysis.
12.1 Convergence
Definition 12.1: Power Series
A power series in variable z with coefficients an is denoted as .
When it converges, its sum is .
Important: Partial sums are polynomial functions.
Example 12.3
for all .
for all with .
12.1.1 Radius of Convergence
Lemma 12.4: Abel's Lemma
If is bounded for some , then is absolutely convergent (ACV) for all with .
Definition 12.5: Radius of Convergence (R)
.
This ensures is a single, well-defined value in .
Theorem 12.7: Convergence Criteria
If ACV.
If diverges (GDV).
The region \{</p></blockquote><p style="text-align: left;"></p><p style="text-align: left;">z \in \mathbb{K} \mid |z| < R\} is the open disk of convergence . The behavior on the boundary (circle of uncertainty) is unknown from this theorem alone.
If , the series converges for all .
If , the series converges only for .
Corollary 12.9: Alternative Definitions of R
(Definition)
12.1.2 Practical Determination of the Radius of Convergence
Bounding Method
If is bounded (or , or CV), then .
If is unbounded (or , or DV), then .
Goal: Find such that to conclude .
Comparison Method
For (RDC ) and (RDC ):
If for large , then . (Similarly for or ).
If , then . (This includes ).
d'Alembert's Ratio Test (for series with $a_n \neq 0$ from some rank)
If $\left| \frac{a_{n+1}}{a_n} \right| \xrightarrow[n \to +\infty]{} \ell$ (where $\ell \in [0, +\infty]$):
If $\ell \in (0, +\infty)$, then $R = \frac{1}{\ell}$.
If $\ell = 0$, then $R = +\infty$.
If $\ell = +\infty$, then $R = 0$.
Caution: This rule does not directly apply to lacunary series (series with many zero coefficients), though similar principles can be adapted.
Proposition 12.14
For any $\alpha \in \mathbb{R}$, the power series $\sum n^\alpha z^n$ has a radius of convergence $R = 1$.
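The ratio test can be illustrated numerically: the sketch below (the helper `ratio_limit_estimate` is ours) estimates $\ell = \lim |a_{n+1}/a_n|$ by evaluating the ratio at a large index, for $a_n = 2^n$ (so $R = 1/2$) and for $a_n = n^3$, matching Proposition 12.14 with $\alpha = 3$ (so $R = 1$).

```python
def ratio_limit_estimate(a, n=200):
    """Estimate l = lim |a_{n+1} / a_n| by evaluating the ratio at a large index n."""
    return abs(a(n + 1) / a(n))

# a_n = 2^n: the ratio tends to l = 2, so R = 1/l = 1/2.
l_geo = ratio_limit_estimate(lambda n: 2.0**n)
print(1 / l_geo)  # 0.5

# a_n = n^3: ((n+1)/n)^3 -> 1, so R = 1 (Proposition 12.14 with alpha = 3).
l_poly = ratio_limit_estimate(lambda n: float(n**3))
print(abs(l_poly - 1) < 0.05)  # True
```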
12.1.3 Operations on Power Series
Given $\sum a_n z^n$ (RDC $R_a$, sum $S_a$) and $\sum b_n z^n$ (RDC $R_b$, sum $S_b$).
Scalar Multiplication: $\sum (\lambda a_n) z^n$ for $\lambda \in \mathbb{K}$.
RDC $= R_a$ if $\lambda \neq 0$, and $= +\infty$ if $\lambda = 0$.
Sum: $\lambda S_a(z)$ for $|z| < R_a$.
Addition: $\sum (a_n + b_n) z^n$.
RDC $R \ge \min(R_a, R_b)$.
If $R_a \neq R_b$: $R = \min(R_a, R_b)$.
If $R_a = R_b$, then $R \ge R_a$, and it may be strictly greater.
Sum: $S_a(z) + S_b(z)$ for $|z| < \min(R_a, R_b)$.
Cauchy Product: $\sum c_n z^n$ with $c_n = \sum_{k=0}^{n} a_k b_{n-k}$.
RDC $\ge \min(R_a, R_b)$.
Sum: $S_a(z) S_b(z)$ for $|z| < \min(R_a, R_b)$.
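A quick numerical check of the Cauchy product (our own sketch): squaring the geometric series gives $c_n = n + 1$, the coefficients of $\frac{1}{(1-z)^2}$, and the product series does sum to $S_a(z) S_b(z)$ inside the common disk.

```python
def cauchy_product(a, b, N):
    """Coefficients c_n = sum_{k=0}^{n} a_k * b_{n-k} of the product series, n <= N."""
    return [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(N + 1)]

N = 50
a = [1.0] * (N + 1)          # coefficients of 1/(1-z), R_a = 1
b = [1.0] * (N + 1)          # same series, R_b = 1
c = cauchy_product(a, b, N)  # c_n = n + 1: the expansion of 1/(1-z)^2

z = 0.3                      # |z| < min(R_a, R_b)
Sc = sum(c[n] * z**n for n in range(N + 1))
print(abs(Sc - 1 / (1 - z) ** 2))  # ~0: the product series sums to S_a(z) * S_b(z)
```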
12.2 Power Series of a Real Variable
For a power series $\sum a_n t^n$ with a real variable $t$, the open interval of convergence is $(-R, R)$, and the circle of uncertainty reduces to the two points $-R$ and $R$.
Theorem 12.18: Normal Convergence
A power series of a real variable converges normally on any compact interval (segment) $[-r, r]$ with $0 \le r < R$, hence on every segment contained in the open interval of convergence $(-R, R)$.
12.2.1 Continuity
Theorem 12.19: Continuity of the Sum
The sum of a power series is continuous on the open interval of convergence $(-R, R)$.
Continuity at an endpoint can be shown if there is uniform convergence on the corresponding closed interval (e.g., $[0, R]$ or $[-R, 0]$).
12.2.2 Integration (Term by Term)
Theorem 12.21: Integration and RDC
The series $\sum \frac{a_n}{n+1} t^{n+1}$ (primitive series) has the same radius of convergence $R$.
A primitive of $S$ on $(-R, R)$ is obtained by integrating term by term:
For all $t \in (-R, R)$, $\int_0^t S(u)\,du = \sum_{n=0}^{+\infty} \frac{a_n}{n+1} t^{n+1}$.
RDC is invariant under term-by-term integration.
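For instance, integrating $\frac{1}{1+t} = \sum (-1)^n t^n$ (RDC $1$) term by term gives the primitive series for $\ln(1+t)$, which the sketch below (helper name ours) verifies numerically.

```python
from math import log

# Term-by-term integration of 1/(1+t) = sum (-1)^n t^n (R = 1): the primitive
# series sum (-1)^n t^(n+1)/(n+1) has the same R and sums to ln(1+t).

def primitive_series(t, N=200):
    return sum((-1) ** n * t ** (n + 1) / (n + 1) for n in range(N + 1))

t = 0.5
print(abs(primitive_series(t) - log(1 + t)))  # ~0 on (-1, 1)
```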
12.2.3 Differentiation (Term by Term)
Theorem 12.24: Differentiation and RDC
For any $k \in \mathbb{N}$, the $k$-th derived series $\sum_{n \ge k} n(n-1)\cdots(n-k+1)\, a_n t^{n-k}$ has the same radius of convergence $R$.
The sum $S$ is infinitely differentiable ($\mathcal{C}^\infty$) on $(-R, R)$.
The derivative is found by differentiating term by term:
For all $t \in (-R, R)$, $S'(t) = \sum_{n=1}^{+\infty} n a_n t^{n-1}$.
Coefficients relation: $a_n = \frac{S^{(n)}(0)}{n!}$ for all $n \in \mathbb{N}$.
RDC is invariant under term-by-term differentiation. Specifically, $\sum a_n t^n$ and $\sum n a_n t^{n-1}$ have the same RDC.
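Differentiating the geometric series term by term illustrates the theorem; the small check below (helper name ours) confirms that $\sum n t^{n-1}$ sums to $\left(\frac{1}{1-t}\right)' = \frac{1}{(1-t)^2}$ inside $(-1, 1)$.

```python
# Term-by-term differentiation: S(t) = sum t^n = 1/(1-t) on (-1, 1), and the
# derived series sum n t^(n-1) (same R = 1) sums to S'(t) = 1/(1-t)^2.

def derived_series(t, N=100):
    return sum(n * t ** (n - 1) for n in range(1, N + 1))

t = 0.4
print(abs(derived_series(t) - 1 / (1 - t) ** 2))  # ~0 inside the interval
```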
12.3 Power Series Expansions (Real Variable)
Definition 12.27: Developable in a Power Series (DSE(0))
A function $f$ is DSE(0) (developable in a power series at 0) on $(-r, r)$ if there exists a power series $\sum a_n t^n$ with RDC $\ge r$ such that $f(t) = \sum_{n=0}^{+\infty} a_n t^n$ for all $t \in (-r, r)$.
Definition 12.28: Taylor Series at 0
If $f$ is $\mathcal{C}^\infty$ near 0, its Taylor series at 0 is $\sum \frac{f^{(n)}(0)}{n!} t^n$.
Proposition 12.29: Uniqueness of DSE(0)
If $f$ is DSE(0) on $(-r, r)$, then $f$ is $\mathcal{C}^\infty$ on $(-r, r)$ and its DSE(0) is uniquely its Taylor series at 0: $a_n = \frac{f^{(n)}(0)}{n!}$.
Caution: Not all $\mathcal{C}^\infty$ functions are DSE(0) (classic counterexample: $f(t) = e^{-1/t^2}$ for $t \neq 0$, $f(0) = 0$, whose Taylor series at 0 is identically zero while $f$ is not).
12.3.2 Common Power Series Expansions
Geometric Series:
$\frac{1}{1 - t} = \sum_{n=0}^{+\infty} t^n$, for $t \in (-1, 1)$.
$\frac{1}{1 + t} = \sum_{n=0}^{+\infty} (-1)^n t^n$, for $t \in (-1, 1)$.
Logarithm:
$\ln(1 + t) = \sum_{n=1}^{+\infty} \frac{(-1)^{n-1}}{n} t^n$, for $t \in (-1, 1]$.
Arctangent:
$\arctan(t) = \sum_{n=0}^{+\infty} \frac{(-1)^n}{2n + 1} t^{2n+1}$, for $t \in [-1, 1]$.
Exponential:
$e^t = \sum_{n=0}^{+\infty} \frac{t^n}{n!}$, for all $t \in \mathbb{R}$.
Hyperbolic Functions (Real): for all $t \in \mathbb{R}$.
$\cosh(t) = \sum_{n=0}^{+\infty} \frac{t^{2n}}{(2n)!}$.
$\sinh(t) = \sum_{n=0}^{+\infty} \frac{t^{2n+1}}{(2n+1)!}$.
Trigonometric Functions (Real): for all $t \in \mathbb{R}$.
$\cos(t) = \sum_{n=0}^{+\infty} \frac{(-1)^n t^{2n}}{(2n)!}$.
$\sin(t) = \sum_{n=0}^{+\infty} \frac{(-1)^n t^{2n+1}}{(2n+1)!}$.
Binomial Series:
If $\alpha \in \mathbb{N}$, $(1 + t)^\alpha = \sum_{n=0}^{\alpha} \binom{\alpha}{n} t^n$ for all $t \in \mathbb{R}$ (a polynomial, $R = +\infty$).
If $\alpha \in \mathbb{R} \setminus \mathbb{N}$, $(1 + t)^\alpha = \sum_{n=0}^{+\infty} \frac{\alpha (\alpha - 1) \cdots (\alpha - n + 1)}{n!} t^n$, for $t \in (-1, 1)$ ($R = 1$).
Special case $\alpha = -1$: $\frac{1}{1 + t} = \sum_{n=0}^{+\infty} (-1)^n t^n$, for $t \in (-1, 1)$.
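The table of expansions can be spot-checked against the standard library; the sketch below (helper `taylor` and the test point are ours) compares truncated sums with `math` at $t = 0.3$.

```python
import math

def taylor(coeff, t, N=60):
    """Truncated power-series sum: sum_{n=0}^{N} coeff(n) * t^n."""
    return sum(coeff(n) * t**n for n in range(N + 1))

t = 0.3
checks = {
    "exp":    (taylor(lambda n: 1 / math.factorial(n), t), math.exp(t)),
    "log":    (taylor(lambda n: 0.0 if n == 0 else (-1) ** (n - 1) / n, t), math.log(1 + t)),
    "cos":    (taylor(lambda n: (-1) ** (n // 2) / math.factorial(n) if n % 2 == 0 else 0.0, t), math.cos(t)),
    "arctan": (taylor(lambda n: (-1) ** ((n - 1) // 2) / n if n % 2 == 1 else 0.0, t), math.atan(t)),
}
for name, (approx, exact) in checks.items():
    print(name, abs(approx - exact) < 1e-12)  # True for each expansion
```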
Chapter 11: Sequences and Series of Functions (Context for Power Series)
This chapter provides the foundational concepts of convergence for sequences and series of functions, which are crucial for understanding power series.
11.1 Convergences
11.1.1 Simple Convergence (Pointwise)
Definition: $(f_n)$ converges simply on $A$ to $f$ if for each $x \in A$, the numerical sequence $(f_n(x))$ converges to $f(x)$ as $n \to +\infty$.
For Series: $\sum f_n$ converges simply on $A$ if for each $x \in A$, the numerical series $\sum f_n(x)$ converges. Its sum is $S(x) = \sum_{n=0}^{+\infty} f_n(x)$.
11.1.2 Uniform Convergence
Definition: $(f_n)$ converges uniformly on $A$ to $f$ if for any $\varepsilon > 0$, there exists $N \in \mathbb{N}$ such that for all $n \ge N$ and for all $x \in A$, $|f_n(x) - f(x)| \le \varepsilon$.
$N$ depends only on $\varepsilon$, not on $x$: the bound is uniform in $x$.
Characterization: $(f_n)$ CVU on $A$ to $f$ $\iff \|f_n - f\|_{\infty, A} \xrightarrow[n \to +\infty]{} 0$, where $\|g\|_{\infty, A} = \sup_{x \in A} |g(x)|$.
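The sup-norm characterization can be made concrete with the classic example $f_n(x) = x^n$, which converges simply to 0 on $[0, 1)$; the sketch below (helper names and the finite sample grids are ours) shows the sup norm vanishing on $[0, 1/2]$ but not on $[0, 1)$.

```python
def sup_norm(f, xs):
    """sup of |f| over the sample points xs."""
    return max(abs(f(x)) for x in xs)

f50 = lambda x: x**50  # the term f_n for n = 50

grid_half = [0.5 * k / 1000 for k in range(1001)]    # sample of [0, 1/2]
grid_open = [0.999 * k / 1000 for k in range(1001)]  # sample of [0, 0.999] in [0, 1)

print(sup_norm(f50, grid_half))        # (1/2)^50, essentially 0: CVU on [0, 1/2]
print(sup_norm(f50, grid_open) > 0.9)  # True: sup stays near 1, no CVU on [0, 1)
```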
Theorem 11.5: Implication Chain
CVU on $I$ $\Longrightarrow$ CVUSTS on $I$ $\Longrightarrow$ CVS on $I$. (The converses are false.)
For Series: $\sum f_n$ converges uniformly if the sequence of partial sums $(S_N)$ converges uniformly.
11.1.3 Normal Convergence (for Series of Functions)
Definition: $\sum f_n$ converges normally on $A$ if each $f_n$ is bounded on $A$ and the numerical series $\sum \|f_n\|_{\infty, A}$ converges.
Characterization (Weierstrass M-Test): $\sum f_n$ CVN on $A$ $\iff$ there exists $(M_n) \in \mathbb{R}_+^{\mathbb{N}}$ such that $\sum M_n$ converges and $|f_n(x)| \le M_n$ for all $x \in A$ and all $n$.
Theorem 11.14: Implication Chain for Series
CVN on $A$ $\Longrightarrow$ CVU on $A$ $\Longrightarrow$ CVS on $A$. (The converses are false.)
Note: CVN on $A$ $\Longrightarrow$ absolute convergence (CVA) of $\sum f_n(x)$ for each $x \in A$.
11.2 Limit and Continuity
Theorem 11.17: Continuity of the Limit of a Sequence
If the $f_n$ are continuous on $I$ and $(f_n)$ CVUSTS on $I$ to $f$, then $f$ is continuous on $I$.
Theorem 11.18: Continuity of the Sum of a Series
If the $f_n$ are continuous on $I$ and $\sum f_n$ CVUSTS on $I$, then $S = \sum_{n=0}^{+\infty} f_n$ is continuous on $I$.
Theorem 11.21: Double Limit (Interchanging Limit and Sum)
If each $f_n$ has a limit $\ell_n$ at $a$ (an endpoint of $I$) and $\sum f_n$ CVU on $I$, then:
$\sum \ell_n$ converges.
$\lim_{x \to a} \sum_{n=0}^{+\infty} f_n(x) = \sum_{n=0}^{+\infty} \ell_n$.
Requires CVU, not just CVUSTS.
11.3 Integration over a Segment
Theorem 11.24: Interchanging Limit and Integral (for Sequences)
If the $f_n$ are continuous on $[a, b]$ and $(f_n)$ CVU on $[a, b]$ to $f$, then $\lim_{n \to +\infty} \int_a^b f_n(t)\,dt = \int_a^b f(t)\,dt$.
Theorem 11.25: Term-by-Term Integration (for Series)
If the $f_n$ are continuous on $[a, b]$ and $\sum f_n$ CVU on $[a, b]$, then $\sum \int_a^b f_n(t)\,dt$ converges and $\int_a^b \left( \sum_{n=0}^{+\infty} f_n(t) \right) dt = \sum_{n=0}^{+\infty} \int_a^b f_n(t)\,dt$.
These theorems apply ONLY on a compact segment [a,b].
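A concrete instance of Theorem 11.25 (our own numeric sketch): on the segment $[0, 1/2]$ the geometric series $\sum t^n$ converges uniformly, so summing the term integrals $\int_0^{1/2} t^n\,dt$ must reproduce $\int_0^{1/2} \frac{dt}{1-t} = \ln 2$.

```python
from math import log

# Each term integrates to (1/2)^(n+1) / (n+1); the sum of the integrals
# equals the integral of the sum, which is ln 2.

b = 0.5
sum_of_integrals = sum(b ** (n + 1) / (n + 1) for n in range(200))
print(abs(sum_of_integrals - log(2)))  # ~0
```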
11.4 Differentiation
Theorem 11.27: Differentiability of the Limit of a Sequence
If $f_n \in \mathcal{C}^1(I, \mathbb{K})$, $(f_n)$ CVS on $I$ to $f$, and $(f_n')$ CVUSTS on $I$ to $g$, then $f \in \mathcal{C}^1(I, \mathbb{K})$ and $f' = g$ (i.e., $(\lim f_n)' = \lim f_n'$).
Theorem 11.30: Term-by-Term Differentiation (for Series)
If $f_n \in \mathcal{C}^1(I, \mathbb{K})$, $\sum f_n$ CVS on $I$, and $\sum f_n'$ CVUSTS on $I$, then $S = \sum_{n=0}^{+\infty} f_n$ is $\mathcal{C}^1$ and $S' = \sum_{n=0}^{+\infty} f_n'$.
Extension to $\mathcal{C}^k$: If the $f_n$ are $\mathcal{C}^k$, the derivative series $\sum f_n^{(j)}$ for $j \le k - 1$ converge simply, and the $k$-th derivative series $\sum f_n^{(k)}$ converges CVUSTS, then the sum is $\mathcal{C}^k$ and differentiation can be done term by term up to order $k$.