# First Year Mathematics B (Integration, Series, Discrete Maths & Modelling) (1 Viewer)

#### Flop21

##### Well-Known Member

$\bg_white \noindent For your second attempt at the question, you equated the f_{y} (x,y) = 2xy + h'(y) to f_{x}(x,y) instead of to f_{y}(x,y), which is why it yielded the wrong answer.$
But that still won't get me the answer of 3x^2 y + y^3 = A, will it?

2xy + h'(y) = x^2+y^2

#### InteGrand

##### Well-Known Member

But that still won't get me the answer of 3x^2 y + y^3 = A, will it?

2xy + h'(y) = x^2+y^2
$\bg_white \noindent I realised you also mis-integrated the f_{x}(x,y) = 2xy. Partially integrating this wrt x yields f(x,y) = x^{2}y + h(y) (you did xy^{2} instead of x^2 y).$
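
For concreteness, here is the fully corrected working, assuming (as the replies imply, since the original question isn't shown) that the problem was the exact equation with f_x(x,y) = 2xy and f_y(x,y) = x^2 + y^2:

```latex
\begin{align*}
f_x(x,y) &= 2xy
  \quad\Longrightarrow\quad f(x,y) = x^2 y + h(y)
  \quad \text{(partial integration w.r.t. } x\text{)}\\
f_y(x,y) &= x^2 + h'(y) = x^2 + y^2
  \quad\Longrightarrow\quad h'(y) = y^2
  \quad\Longrightarrow\quad h(y) = \tfrac{y^3}{3}\\
\therefore\ f(x,y) &= x^2 y + \tfrac{y^3}{3}
\end{align*}
```

So the solution curves are $x^2 y + \frac{y^3}{3} = C$, or equivalently (multiplying through by 3 and writing $A = 3C$) $3x^2 y + y^3 = A$.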

#### Flop21

##### Well-Known Member

$\bg_white \noindent I realised you also mis-integrated the f_{x}(x,y) = 2xy. Partially integrating this wrt x yields f(x,y) = x^{2}y + h(y) (you did xy^{2} instead of x^2 y).$
Oh so I think I'm close...

I keep getting down to x^2 y + y^3/3 + C... when the final answer is 3x^2 y + y^3 = A. What have they done between my step and the final answer?

#### InteGrand

##### Well-Known Member

Oh so I think I'm close...

I keep getting down to x^2 y + y^3/3 + C... when the final answer is 3x^2 y + y^3 = A. What have they done between my step and the final answer?
$\bg_white \noindent As I said before, those answers are equivalent. Multiplying your answer through by 3 makes it 3x^{2}y + y^{3} = 3C. But since C was an arbitrary constant, 3C is just as arbitrary, so we can just replace it with A to signify an arbitrary constant. So the solution is 3x^2 y + y^3 = A, for A constant. You could also just leave your answer as x^{2} y + \frac{y^{3}}{3} = C and it'd be equally as valid.$

#### Flop21

##### Well-Known Member

$\bg_white \noindent As I said before, those answers are equivalent. Multiplying your answer through by 3 makes it 3x^{2}y + y^{3} = 3C. But since C was an arbitrary constant, 3C is just as arbitrary, so we can just replace it with A to signify an arbitrary constant. So the solution is 3x^2 y + y^3 = A, for A constant. You could also just leave your answer as x^{2} y + \frac{y^{3}}{3} = C and it'd be equally as valid.$
Oh I missed you saying this.

OKAY GREAT.

Just got the next one after this one correct anyway, so I'm on the right track now.

Thank you.

#### leehuan

##### Well-Known Member

$\bg_white \text{Let }T:V\to V\text{ be a linear map over the vector space }V\text{ and suppose the following are subsets of }V: S=\{\textbf{v}_1,\textbf{v}_2,\textbf{v}_3\}\\ R=\{T(\textbf{v}_1),T(\textbf{v}_2),T(\textbf{v}_3)\}$

$\bg_white \text{Prove that if }R\text{ is a linearly independent set, then }S\text{ is a linearly independent set.}$

$\bg_white \text{So I get what it is they're trying to say. So I arrive at considering this}\\ T(\lambda_1\textbf{v}_1+\lambda_2\textbf{v}_2+ \lambda_3 \textbf{v}_3)=\textbf{0}\\ \iff \lambda_j=0\, \forall j$

$\bg_white \text{But am I allowed to just use }T(\textbf{x})=\textbf{0} \iff \textbf{x}=\textbf{0}?\\ \text{Because I feel like this is wrong. Hence I don't know how to finish the proof off.}$

#### InteGrand

##### Well-Known Member

$\bg_white \text{Let }T:V\to V\text{ be a linear map over the vector space }V\text{ and suppose the following are subsets of }V: S=\{\textbf{v}_1,\textbf{v}_2,\textbf{v}_3\}\\ R=\{T(\textbf{v}_1),T(\textbf{v}_2),T(\textbf{v}_3)\}$

$\bg_white \text{Prove that if }R\text{ is a linearly independent set, then }S\text{ is a linearly independent set.}$

$\bg_white \text{So I get what it is they're trying to say. So I arrive at considering this}\\ T(\lambda_1\textbf{v}_1+\lambda_2\textbf{v}_2+ \lambda_3 \textbf{v}_3)=\textbf{0}\\ \iff \lambda_j=0\, \forall j$

$\bg_white \text{But am I allowed to just use }T(\textbf{x})=\textbf{0} \iff \textbf{x}=\textbf{0}?\\ \text{Because I feel like this is wrong. Hence I don't know how to finish the proof off.}$
$\bg_white \noindent We'll show the contrapositive, i.e. that if S is dependent, then R is dependent (we'll do it for arbitrary n elements, since it works the same).$

$\bg_white \noindent Suppose S is dependent, then there exist scalars \alpha_{1},\ldots, \alpha_{n} not all 0 such that \alpha_{1}\mathbf{v}_{1} + \cdots + \alpha_{n} \mathbf{v}_{n} = \mathbf{0}. Apply T to both sides, using linearity on LHS and T\left(\mathbf{0}\right) = \mathbf{0} on the RHS, then we have \alpha_{1}T\left(\mathbf{v}_{1}\right) + \cdots + \alpha_{n} T\left(\mathbf{v}_{n}\right) = \mathbf{0}. So we have expressed \mathbf{0} as a non-trivial linear combination of the elements of R (since not all \alpha_{j} were 0), whence R is a linearly dependent set. This completes the proof.$
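
The key computation in display form: applying $T$ to a non-trivial dependence relation for $S$ turns it directly into one for $R$,

```latex
\mathbf{0} = T(\mathbf{0})
  = T\!\left( \sum_{j=1}^{n} \alpha_j \mathbf{v}_j \right)
  = \sum_{j=1}^{n} \alpha_j\, T(\mathbf{v}_j),
  \qquad \text{with not all } \alpha_j = 0.
```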

#### Drsoccerball

##### Well-Known Member

$\bg_white \text{Let }T:V\to V\text{ be a linear map over the vector space }V\text{ and suppose the following are subsets of }V: S=\{\textbf{v}_1,\textbf{v}_2,\textbf{v}_3\}\\ R=\{T(\textbf{v}_1),T(\textbf{v}_2),T(\textbf{v}_3)\}$

$\bg_white \text{Prove that if }R\text{ is a linearly independent set, then }S\text{ is a linearly independent set.}$

$\bg_white \text{So I get what it is they're trying to say. So I arrive at considering this}\\ T(\lambda_1\textbf{v}_1+\lambda_2\textbf{v}_2+ \lambda_3 \textbf{v}_3)=\textbf{0}\\ \iff \lambda_j=0\, \forall j$

$\bg_white \text{But am I allowed to just use }T(\textbf{x})=\textbf{0} \iff \textbf{x}=\textbf{0}?\\ \text{Because I feel like this is wrong. Hence I don't know how to finish the proof off.}$
By definition T(0) = 0.

#### leehuan

##### Well-Known Member

By definition T(0) = 0.
But I do not believe that the converse need be true.

If T(x) = 0, then I don't believe that x = 0 in every situation.

#### InteGrand

##### Well-Known Member

By definition T(0) = 0.
He was more asking about the other direction, i.e. whether he could conclude from T(something) = 0 that something = 0. This is guaranteed if and only if the linear map T is a one-to-one map. If T is not one-to-one, it'll have a non-trivial kernel.
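
A standard example (not from the thread) of a linear map that is not one-to-one, so that $T(\mathbf{x}) = \mathbf{0}$ does not force $\mathbf{x} = \mathbf{0}$:

```latex
T : \mathbb{R}^2 \to \mathbb{R}^2, \qquad T(x,y) = (x, 0).
% T is linear, yet
T(0,1) = (0,0) = \mathbf{0} \quad \text{while} \quad (0,1) \neq \mathbf{0},
% so the kernel is non-trivial:
\ker T = \{ (0, y) : y \in \mathbb{R} \} \neq \{ \mathbf{0} \}.
```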

#### leehuan

##### Well-Known Member

$\bg_white \noindent We'll show the contrapositive, i.e. that if S is dependent, then R is dependent (we'll do it for arbitrary n elements, since it works the same).$

$\bg_white \noindent Suppose S is dependent, then there exist scalars \alpha_{1},\ldots, \alpha_{n} not all 0 such that \alpha_{1}\mathbf{v}_{1} + \cdots + \alpha_{n} \mathbf{v}_{n} = \mathbf{0}. Apply T to both sides, using linearity on LHS and T\left(\mathbf{0}\right) = \mathbf{0} on the RHS, then we have \alpha_{1}T\left(\mathbf{v}_{1}\right) + \cdots + \alpha_{n} T\left(\mathbf{v}_{n}\right) = \mathbf{0}. So we have expressed \mathbf{0} as a non-trivial linear combination of the elements of R (since not all \alpha_{j} were 0), whence R is a linearly dependent set. This completes the proof.$
I see, really nice

Is there a way to do it in a forwards manner though? Because I asked it for a friend and I'm not sure if, in their course, they need to consider the contrapositive

If it's probably too hard then maybe don't worry

#### InteGrand

##### Well-Known Member

I see, really nice

Is there a way to do it in a forwards manner though? Because I asked it for a friend and I'm not sure if, in their course, they need to consider the contrapositive

If it's probably too hard then maybe don't worry
Not really, because to conclude just from T(something) = 0 that something = 0 (which is what you seemed to want to do), we'd have to know T was one-to-one, but T can be any linear map (and the result holds regardless). You'd end up having to do it by contradiction I think (which makes it similar to contrapositive).

#### leehuan

##### Well-Known Member

Not really, because to conclude just from T(something) = 0 that something = 0 (which is what you seemed to want to do), we'd have to know T was one-to-one, but T can be any linear map (and the result holds regardless). You'd end up having to do it by contradiction I think (which makes it similar to contrapositive).
Fair enough, alright sweet no worries

#### Flop21

##### Well-Known Member

Finding the basis for the kernel of this matrix (already row reduced):

The answers show (sorry you can barely see):

My understanding is that to solve this you set the variable of the non-leading column = lambda, so e.g. x3 = lambda. Then you solve the equations.

But what happens when you have multiple non-leading columns like here? I don't understand what they've done in the answers, and why they have made two things = lambda, and one other = mu.

#### Drsoccerball

##### Well-Known Member

Finding the basis for the kernel of this matrix (already row reduced):

The answers show (sorry you can barely see):

My understanding is that to solve this you set the variable of the non-leading column = lambda, so e.g. x3 = lambda. Then you solve the equations.

But what happens when you have multiple non-leading columns like here? I don't understand what they've done in the answers, and why they have made two things = lambda, and one other = mu.
If you have two or more free variables, assign each one its own parameter.
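
Since the matrix from the thread isn't visible, here is a made-up row-reduced matrix illustrating the rule: one fresh parameter per non-leading column.

```latex
A = \begin{pmatrix} 1 & 2 & 0 & -1 \\ 0 & 0 & 1 & 3 \end{pmatrix}
% Columns 2 and 4 are non-leading, so set x_2 = \lambda and x_4 = \mu.
% The two equations then give x_1 = -2\lambda + \mu and x_3 = -3\mu, hence
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix}
  = \lambda \begin{pmatrix} -2 \\ 1 \\ 0 \\ 0 \end{pmatrix}
  + \mu \begin{pmatrix} 1 \\ 0 \\ -3 \\ 1 \end{pmatrix},
% so those two vectors form a basis for \ker A.
```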

#### Drsoccerball

##### Well-Known Member

Fair enough, alright sweet no worries
But we don't do contrapositives in 1241, 1251...

#### seanieg89

##### Well-Known Member

Have they explicitly said you cannot argue by contrapositive/contradiction/etc? These are just fundamental methods of proof, not specific knowledge that may or may not be in the syllabus.

Your lecturer might not specifically mention them, but I would be extremely surprised if you were not allowed to use them. Sometimes proofs are simply more concise when structured that way.

#### He-Mann

##### Vexed?

But we don't do contrapositives in 1241, 1251...
I remember doing proof by contradiction. I also remember an interesting discussion on mathstackexchange about proof by contradiction and contrapositive.

But like, if you have a super logical mind, then you'll recognise contrapositive. Consider: https://en.wikipedia.org/wiki/Wason_selection_task

#### Drsoccerball

##### Well-Known Member

Have they explicitly said you cannot argue by contrapositive/contradiction/etc? These are just fundamental methods of proof, not specific knowledge that may or may not be in the syllabus.

Your lecturer might not specifically mention them, but I would be extremely surprised if you were not allowed to use them. Sometimes proofs are simply more concise when structured that way.
It may just be unfair that we know a method some normal maths students wouldn't know, but let's not take the risk.

#### leehuan

##### Well-Known Member

Have they explicitly said you cannot argue by contrapositive/contradicition/etc? These are just fundamental methods of proof, not specific knowledge that may or may not be in syllabus.

Your lecturer might not specifically mention them but I would be extremely surprised if you are not allowed to use them. Sometimes proofs are just more concise etc when structured in this way.
Nah, they're definitely allowed. It's only the unfairness factor: not everyone would think in that direction due to zero exposure to it.