Re: MATH1131 help thread
> If I use $u = 1 + \sqrt{x}$...
> $x = (u-1)^2$
> $\text{d}x = 2(u-1)\,\text{d}u$
> Then substituting into the original integral, I have to integrate $\frac{2u-2}{u}$ with respect to $u$...
> Can someone take it from here and finish it off? I get very close to the answer but don't get it, I think because I'm doing incorrect algebra after that last step ^.

So the integral becomes $I = \int \frac{1}{u}\times 2\left(u-1\right)\,\text{d}u$. Write $\frac{u-1}{u}$ as $1-\frac{1}{u}$, so then $I = 2\int \left(1-\frac{1}{u}\right)\,\text{d}u = 2u - 2\ln |u| + C$. Since $u = 1+\sqrt{x}$, this becomes $2+2\sqrt{x}-2\ln \left(1+\sqrt{x}\right)+C$, where the absolute value can be dropped because $1+\sqrt{x} > 0$. Since $2+C$ is just an arbitrary constant again, we can write the final answer as $I = 2\sqrt{x}-2\ln \left(1+\sqrt{x}\right)+C$.
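A quick way to convince yourself the answer is right (a sanity check, assuming the original integral was $\int \frac{1}{1+\sqrt{x}}\,\text{d}x$, which is what the substitution working above implies): differentiate the result and check that you recover the integrand.

$$\frac{\text{d}}{\text{d}x}\left[2\sqrt{x}-2\ln\left(1+\sqrt{x}\right)\right] = \frac{1}{\sqrt{x}} - \frac{1}{\sqrt{x}\left(1+\sqrt{x}\right)} = \frac{1}{\sqrt{x}}\cdot \frac{\left(1+\sqrt{x}\right)-1}{1+\sqrt{x}} = \frac{1}{1+\sqrt{x}}.$$

Here the second term comes from the chain rule: $\frac{\text{d}}{\text{d}x}\,\ln\left(1+\sqrt{x}\right) = \frac{1}{1+\sqrt{x}}\cdot \frac{1}{2\sqrt{x}}$.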