Integration by Parts

Integration by Parts
Integration by Parts with a definite integral
Going in Circles
Tricks of the Trade

Integrals of Trig Functions

Antiderivatives of Basic Trigonometric Functions
Product of Sines and Cosines (mixed even and odd powers or only odd powers)
Product of Sines and Cosines (only even powers)
Product of Secants and Tangents
Other Cases

Trig Substitutions

How Trig Substitution Works
Summary of trig substitution options
Completing the Square

Partial Fractions

Introduction to Partial Fractions
Linear Factors
Irreducible Quadratic Factors
Improper Rational Functions and Long Division

Strategies of Integration

Integration by Parts
Trig Integrals
Trig Substitutions
Partial Fractions

Improper Integrals

Type 1 - Improper Integrals with Infinite Intervals of Integration
Type 2 - Improper Integrals with Discontinuous Integrands
Comparison Tests for Convergence

Modeling with Differential Equations

Separable Equations
A Second Order Problem

Euler's Method and Direction Fields

Euler's Method (follow your nose)
Direction Fields
Euler's method revisited

Separable Equations

The Simplest Differential Equations
Separable differential equations
Mixing and Dilution

Models of Growth

Exponential Growth and Decay
The Zombie Apocalypse (Logistic Growth)

Linear Equations

Linear ODEs: Working an Example
The Solution in General
Saving for Retirement

Parametrized Curves

Three kinds of functions, three kinds of curves
The Cycloid
Visualizing Parametrized Curves
Tracing Circles and Ellipses
Lissajous Figures

Calculus with Parametrized Curves

Video: Slope and Area
Video: Arclength and Surface Area
Summary and Simplifications
Higher Derivatives

Polar Coordinates

Definitions of Polar Coordinates
Graphing polar functions
Video: Computing Slopes of Tangent Lines

Areas and Lengths of Polar Curves

Area Inside a Polar Curve
Area Between Polar Curves
Arc Length of Polar Curves

Conic sections

Slicing a Cone
Parabolas and Directrices
Shifting the Center by Completing the Square

Conic Sections in Polar Coordinates

Foci and Directrices
Visualizing Eccentricity
Astronomy and Equations in Polar Coordinates

Infinite Sequences

Approximate Versus Exact Answers
Examples of Infinite Sequences
Limit Laws for Sequences
Theorems for and Examples of Computing Limits of Sequences
Monotonic Convergence

Infinite Series

Geometric Series
Limit Laws for Series
Test for Divergence and Other Theorems
Telescoping Sums

Integral Test

Preview of Coming Attractions
The Integral Test
Estimates for the Value of the Series

Comparison Tests

The Basic Comparison Test
The Limit Comparison Test

Convergence of Series with Negative Terms

Introduction, Alternating Series, and the AS Test
Absolute Convergence

The Ratio and Root Tests

The Ratio Test
The Root Test

Strategies for testing Series

Strategy to Test Series and a Review of Tests
Examples, Part 1
Examples, Part 2

Power Series

Radius and Interval of Convergence
Finding the Interval of Convergence
Power Series Centered at $x=a$

Representing Functions as Power Series

Functions as Power Series
Derivatives and Integrals of Power Series
Applications and Examples

Taylor and Maclaurin Series

The Formula for Taylor Series
Taylor Series for Common Functions
Adding, Multiplying, and Dividing Power Series
Miscellaneous Useful Facts

Applications of Taylor Polynomials

Taylor Polynomials
When Functions Are Equal to Their Taylor Series
When a Function Does Not Equal Its Taylor Series
Other Uses of Taylor Polynomials

Functions of 2 and 3 variables

Functions of several variables
Limits and continuity

Partial Derivatives

One variable at a time (yet again)
Definitions and Examples
An Example from DNA
Geometry of partial derivatives
Higher Derivatives
Differentials and Taylor Expansions

Differentiability and the Chain Rule

The First Case of the Chain Rule
Chain Rule, General Case
Video: Worked problems

Multiple Integrals

General Setup and Review of 1D Integrals
What is a Double Integral?
Volumes as Double Integrals

Iterated Integrals over Rectangles

How To Compute Iterated Integrals
Examples of Iterated Integrals
Fubini's Theorem
Summary and an Important Example

Double Integrals over General Regions

Type I and Type II regions
Examples 1-4
Examples 5-7
Swapping the Order of Integration
Area and Volume Revisited

Double integrals in polar coordinates

$dA = r \, dr \, d\theta$

Multiple integrals in physics

Double integrals in physics
Triple integrals in physics

Integrals in Probability and Statistics

Single integrals in probability
Double integrals in probability

Change of Variables

Review: Change of variables in 1 dimension
Mappings in 2 dimensions
Bonus: Cylindrical and spherical coordinates

A random variable is a quantity that can take on several different values, depending on chance. Examples include:

  • The number of football games that the Texas Longhorns will win this fall.
  • The number of electoral votes that the Republican candidate will get in the next presidential election.
  • Tomorrow's high temperature in Austin.
  • The $x$ coordinate of the spot when my next dart hits the dartboard.
The first two examples are discrete random variables, meaning that you make a list of all the possible outcomes, and then assign a number, called the probability, to each one. The last two examples are continuous random variables, meaning that the possible outcomes form a continuous range.

If $X$ is a continuous random variable, then the probability of any one outcome is zero. Instead, we consider the probability of a range of outcomes. The probability density function (abbreviated "pdf") $f_X(x)$ gives the probability per unit length. In the following video, we show how to use such a function, and we learn about three standard examples, called the uniform, exponential, and normal distributions.

The probability of $X$ landing somewhere between $a$ and $b$ is $$P(a \le X \le b) = \int_a^b f_X(x) dx.$$

More generally, the probability that $X$ lands somewhere in a region $R$ is $$P(X \in R) = \int_R f_X(x) dx.$$

Of course, there is a 100% chance of $X$ taking on some value, so $$\int_{-\infty}^\infty f_X(x) dx = 1.$$ If $X$ is restricted to a smaller range than $(-\infty, \infty)$, we just integrate over that range.
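The range formula can be checked numerically. The sketch below (our own code, not part of the course materials) approximates $P(-1 \le X \le 1)$ for the standard normal distribution, one of the three pdfs mentioned above, using a midpoint-rule Riemann sum; the function names `normal_pdf` and `prob_between` are our own.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution with mean mu and standard deviation sigma."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def prob_between(a, b, n=100_000):
    """Approximate P(a <= X <= b) = integral of the pdf over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return h * sum(normal_pdf(a + (i + 0.5) * h) for i in range(n))

p = prob_between(-1, 1)
print(p)  # about 0.6827: the familiar 68% of the 68-95-99.7 rule
```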

Example 1: Let $T \ge 0$ be the lifetime (in years) of a newly bought light bulb. Suppose that $f_T(t) = C e^{-t/3}$ for some constant $C$. What is $C$?

Solution: Since $T$ can't be negative, we only integrate from 0 to $\infty$ instead of from $-\infty$ to $\infty$. $$1 \ = \ \int_0^\infty C e^{-t/3} dt = 3C,$$ so $C=1/3$.
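Here is a quick numerical sanity check on Example 1 (a sketch with a hand-rolled midpoint rule; the finite cutoff at $t = 60$ stands in for $\infty$, since the tail of $e^{-t/3}$ beyond that point is on the order of $e^{-20}$):

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Integrate e^{-t/3} from 0 to a large cutoff as a stand-in for infinity.
total = integrate(lambda t: math.exp(-t / 3), 0, 60)
C = 1 / total
print(total, C)  # total is approximately 3, so C is approximately 1/3
```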

If $X$ is a random variable, the expectation or mean value of $X$ is the average value we get when we run the experiment over and over again. In the following video, we explain how to use integrals to compute expectations, variances, and standard deviations.

Since values between $x$ and $x + dx$ come up a fraction $f_X(x) dx$ times, the expectation of $X$ is $$E(X) = \int_{-\infty}^\infty x f_X(x) dx.$$ This average value is often denoted with the Greek letter $\mu$.

Example 2: What is the average lifetime of a light bulb if its probability density function is $f_T(t) = \frac{1}{3} e^{-t/3}$ for $t \ge 0$ (and zero if $t < 0$)?

Solution: We compute the integral

$$\mu\ = \ E(T) \ = \ \int_0^\infty t f_T(t) dt \ = \ \int_0^\infty \frac{t e^{-t/3}}{3} dt \ = \ 3.$$ (Do you remember how to evaluate that integral? Integrate by parts!) These light bulbs last an average of 3 years.
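The same midpoint-rule idea verifies the expectation in Example 2 numerically (a sketch; the cutoff at $t = 120$ again stands in for $\infty$):

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

f_T = lambda t: math.exp(-t / 3) / 3       # pdf from Example 2
mu = integrate(lambda t: t * f_T(t), 0, 120)  # E(T), cutoff 120 stands in for infinity
print(mu)  # approximately 3 years, matching the integration-by-parts answer
```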

If the mean value of $X$ is $\mu$, then the variance of $X$ is the average value of $(X-\mu)^2$. This measures how wide the probability distribution is; it carries the units of $X^2$ and is denoted $Var(X)$ or $\sigma^2$.

$$\sigma^2 = Var(X) = E((X-\mu)^2) = \int_{-\infty}^\infty (x-\mu)^2 f_X(x) dx.$$ In practice, we often use a simpler formula: $$\sigma^2 = Var(X) = E(X^2) - \mu^2 = \left (\int_{-\infty}^\infty x^2 f_X(x) dx\right) - \mu^2.$$ $\sigma = \sqrt{Var(X)}$ is called the standard deviation of $X$.
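The shortcut formula $Var(X) = E(X^2) - \mu^2$ can be tried out on the light-bulb distribution from Examples 1 and 2. The sketch below (our own code) computes $E(T)$ and $E(T^2)$ numerically; an exponential distribution with mean 3 is known to have variance $3^2 = 9$, so that is the expected answer.

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

f_T = lambda t: math.exp(-t / 3) / 3               # pdf from Examples 1 and 2
mu = integrate(lambda t: t * f_T(t), 0, 150)       # E(T)
second_moment = integrate(lambda t: t * t * f_T(t), 0, 150)  # E(T^2)
var = second_moment - mu ** 2                      # shortcut: Var(T) = E(T^2) - mu^2
sigma = math.sqrt(var)
print(var, sigma)  # variance approximately 9, standard deviation approximately 3
```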

Example 3: A random variable $X$ takes values between $0$ and $1$ with pdf $f_X(x) = 1$ when $0 \le x \le 1$ (and 0 otherwise). This is called the uniform distribution. Find the mean, variance and standard deviation of $X$.

Solution: We first check that $f_X(x)$ is a legitimate pdf: $\int_0^1 f_X(x) dx = \int_0^1 dx = 1$, as it should be.

Next we compute the mean: $$\mu = E(X) = \int_0^1 x dx = \frac{1}{2}.$$ Then we compute the variance: $$ Var(X) = \left(\int_0^1 x^2 dx\right ) - \mu^2 = \frac{1}{3} - \frac{1}{4} = \frac{1}{12}.$$ Finally, we compute the standard deviation: $$\sigma = \sqrt{Var(X)} = \frac{1}{2\sqrt{3}} = \frac{\sqrt{3}}{6}.$$
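All three quantities in Example 3 are easy to confirm numerically with the same midpoint-rule sketch (our own code; on $[0,1]$ the uniform pdf is just the constant 1):

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Uniform pdf on [0, 1]: f_X(x) = 1 there, 0 elsewhere.
mu = integrate(lambda x: x, 0, 1)                    # E(X) = 1/2
var = integrate(lambda x: x * x, 0, 1) - mu ** 2     # E(X^2) - mu^2 = 1/3 - 1/4 = 1/12
sigma = math.sqrt(var)
print(mu, var, sigma)  # 0.5, approximately 1/12, approximately 0.2887
```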