The Official Math/Physics/Whatever Homework questions thread

11-28-2012 , 11:31 PM
Use \left( and \right) to make your parentheses the correct size.
11-29-2012 , 12:56 AM
Yes, although be careful with ± signs when taking square roots, and also try to eliminate one of the 3 functions to get an equation in the other 2, say a, b only. I was joking, trying to see if it can be solved or if it turns out to be a tough one, justifying the term "punishment".

When you took the derivative of the first one, you can in principle completely eliminate, say, c in the derivative equation by using the original squares equation to express c. You can do that in the second one too, and now you have 2 equations in just a, b and their derivatives. After that it gets tricky though, or it's not easily solvable at all. It would be funny if it could be solved nicely with a trick, though...

I was trying to see how restrictive your original Pythagorean "velocities" equation was in the space of Pythagorean functions a^2+b^2=c^2.


If all else fails, at least find a special solution that is nontrivial (other than a, b, c all constants).
11-29-2012 , 02:42 AM
For example, what you said is true if you start with

a(t) = A*t^y, b(t) = B*t^y, c(t) = C*t^y

with A^2 + B^2 = C^2, A, B, C constants, y any number.

That's an easy example. A general-case solution would be interesting and a good fun problem, especially if a trick can solve it.
11-29-2012 , 04:35 AM
Reading about quadratic forms... Am I mistaken in the interpretation that in order to diagonalize the quadratic form of a symmetric matrix A, you can ALWAYS express it as P'DP where P has the eigenvectors of A in its columns and D has the eigenvalues on the diagonal?

The book talks about a change of variables... All that means is switching from the standard basis to a basis of eigenvectors, right?
11-29-2012 , 05:58 AM
Quote:
Originally Posted by klonpucko
Reading about quadratic forms... Am I mistaken in the interpretation that in order to diagonalize the quadratic form of a symmetric matrix A, you can ALWAYS express it as P'DP where P has the eigenvectors of A in its columns and D has the eigenvalues on the diagonal?

The book talks about a change of variables... All that means is switching from the standard basis to a basis of eigenvectors, right?
Yes, read the diagonalization section here and apply it as needed for a symmetric matrix (i.e. left or right eigenvectors):

http://en.wikipedia.org/wiki/Matrix_...iagonalization
(P' is the inverse of P in your notation and P is the right eigenvectors matrix)


http://en.wikipedia.org/wiki/Left_ei...t_eigenvectors
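For a concrete sanity check, here is a minimal NumPy sketch (with a made-up symmetric matrix, not from the book): for symmetric A the eigenvector matrix P is orthogonal, so P' (the transpose) is its inverse, A = P D P', and the quadratic form x'Ax becomes y'Dy under the change of variables x = Py.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # an example symmetric matrix

eigvals, P = np.linalg.eigh(A)  # columns of P are orthonormal eigenvectors
D = np.diag(eigvals)

# P is orthogonal, so its transpose is its inverse
print(np.allclose(P.T @ P, np.eye(2)))   # True

# A diagonalizes as A = P D P'
print(np.allclose(P @ D @ P.T, A))       # True

# the change of variables x = P y turns x'Ax into y'Dy
x = np.array([1.0, 2.0])
y = P.T @ x
print(np.isclose(x @ A @ x, y @ D @ y))  # True
```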
11-29-2012 , 07:15 AM
Thank you Masque.

Before I forget I might as well ask about something from my calculus book that I don't get. It says that all quotients of continuous functions are also continuous. But tan(x) is a quotient of two continuous functions and has an infinity of asymptotes. What's going on there?
11-29-2012 , 08:12 AM
"Continuous" means continuous on its domain. tan(x) = sin(x)/cos(x) is undefined wherever cos(x) = 0, so the asymptote points aren't in its domain; at every point where tan is defined, it is continuous.
11-29-2012 , 08:24 AM
Quote:
Originally Posted by masque de Z
For example, what you said is true if you start with

a(t) = A*t^y, b(t) = B*t^y, c(t) = C*t^y

with A^2 + B^2 = C^2, A, B, C constants, y any number.

That's an easy example. A general-case solution would be interesting and a good fun problem, especially if a trick can solve it.
As of course is true for anything like:

a(t) = A*f(t), b(t) = B*f(t), c(t) = C*f(t)
and
A^2 + B^2 = C^2 (where f(t) is differentiable).

Let's call all these the trivial solutions.
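These trivial solutions are easy to verify symbolically; a sympy sketch (my own choice of f and the 3-4-5 triple), assuming the second constraint is the same Pythagorean relation on the derivatives:

```python
import sympy as sp

t = sp.symbols('t')
A, B, C = 3, 4, 5   # A^2 + B^2 = C^2
f = sp.sin(t)       # any differentiable f(t)

a, b, c = A * f, B * f, C * f

# original relation: a^2 + b^2 = c^2
print(sp.simplify(a**2 + b**2 - c**2))  # 0

# "velocities" relation: a'^2 + b'^2 = c'^2
da, db, dc = (sp.diff(g, t) for g in (a, b, c))
print(sp.simplify(da**2 + db**2 - dc**2))  # 0
```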
11-29-2012 , 09:36 AM
Quote:
Originally Posted by Cueballmania
Use \left( and \right) to make your parentheses the correct size.
I'm new to MathType; do you mean to literally type that statement in? When I did that, nothing appeared to change.
11-29-2012 , 09:47 AM
\left( and \right) make the parens the size of the text
11-29-2012 , 10:30 AM
Quote:
Originally Posted by Wyman
\left( and \right) make the parens the size of the text


[screenshot previously hosted on ImageShack; image no longer available]

I enter the top one into MathType and the bottom one comes out in Word.

Edit to add that MathType changes the font color to red.
11-29-2012 , 10:36 AM
The other guys are talking about LaTeX. I don't know about MathType, but it doesn't seem to have the same function built in. It should be able to do it somehow, though.
11-29-2012 , 10:38 AM
In LaTeX:

Without \left(, \right): (\frac{a}{b})

With: \left( \frac{a}{b} \right)

The latter looks prettier.
11-29-2012 , 10:44 AM
Your editor isn't recognizing the TeX.


generated by:

\left( \frac{a}{b} \right)

(\frac{a}{b})
11-29-2012 , 10:44 AM
Quote:
Originally Posted by acehole60
The other guys are talking about LaTeX. I don't know about MathType, but it doesn't seem to have the same function built in. It should be able to do it somehow, though.
Aaah, that makes sense. Yeah mathtype here.
11-29-2012 , 10:47 AM
Yeah, I don't know how (if at all) to insert pure TeX into your editor. I am seeing references online to a resize-parentheses menu, though.
11-29-2012 , 10:58 AM
Quote:
Originally Posted by Acemanhattan
Aaah, that makes sense. Yeah mathtype here.
Yeah, I figured. I don't know much about MathType and I'm not sure about your future plans, but I'd consider learning LaTeX (or at least something LaTeX-based) if I were you and were planning on doing a lot of stuff like this (exercises, papers, etc.).
11-29-2012 , 05:17 PM
Z is a standard normal random variable, Y1 = Z, Y2 = Z^2. Trying to find E[Y1*Y2] (hint: it equals E[Z^3]).

I don't understand this step of the solution:

[image of the worked solution step; no longer available]

z^2 becomes the new variable, so you can split things up and cancel, but why does z^3 become z^2/2 when you transform like that? Some kind of Jacobian?
11-29-2012 , 05:21 PM
This is just "u-substitution" or "integration by substitution" (or the chain rule, really). You've made the substitution x = z^2, so dx = 2 z dz, so you can sub out that dz for dx/(2z), which is where your "divide by 2z" that you're observing on the z^3 term comes from.
11-29-2012 , 05:32 PM
oh okay I wouldn't have thought to do that but it works out nicely
11-30-2012 , 06:03 PM
Quote:
Originally Posted by bobboufl11
oh okay I wouldn't have thought to do that but it works out nicely
Also notice you already know the answer is zero, because you are integrating an antisymmetric function (with respect to 0, from -inf to +inf): the exp(-z^2/2) factor is symmetric and z^3 is antisymmetric, i.e. f(-z) = -f(z), so the two integrals from -infinity to 0 and from 0 to +infinity cancel each other.

In general, use tricks like that to simplify things. You know that integrals of the x^n*exp(a*x) type are doable with integration-by-parts tricks, so any change of variables that recreates the z^2 term in the differential takes you from the exp(-z^2) form to exp(-w), and then it's doable provided the expression in front of the resulting exponential is not a fractional power. (It's doable even then, but only with infinite limits of integration, to obtain "easy" closed-form results.)
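Both points are easy to sanity-check with sympy; a quick sketch using the standard normal density:

```python
import sympy as sp

z = sp.symbols('z')
pdf = sp.exp(-z**2 / 2) / sp.sqrt(2 * sp.pi)  # standard normal density

# E[Z^3] vanishes: z^3 * pdf is odd, so the two half-line pieces cancel
E_Z3 = sp.integrate(z**3 * pdf, (z, -sp.oo, sp.oo))
print(E_Z3)  # 0

# the substitution view on the half line: with x = z^2, dx = 2z dz,
# so z^3 dz = (x/2) dx and the integrand becomes elementary
half = sp.integrate(z**3 * sp.exp(-z**2 / 2), (z, 0, sp.oo))
print(half)  # 2
```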
12-01-2012 , 02:05 PM
a bit stumped here on an analysis question:

We've proved the "sequential criterion for functional limits":

lim f(x) as x-->c (c a limit point) = L

is equivalent to

For all sequences (Xn) with Xn != c and (Xn)-->c, f(Xn) --> L

Given that lim f(x) as x-->c = L (c a limit point)

WTS: lim kf(x) as x-->c = kL for all k in R.

In the text, Abbott says these follow from the sequential criterion for functional limits and the Algebraic limit theorem for sequences.

I'm having trouble making the connection. Given that lim f(x) x-->c = L, we can recast this as:

For all sequences (Xn) with Xn != c and (Xn)-->c, f(Xn)-->L

So we now have a sequence, f(Xn) that converges to L. So I want to use the algebraic limit theorems to finish the proof.

If f(Xn) converges to L, the algebraic limit theorems tell us that k*f(Xn) converges to kL.

Does this look right?
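Yes, that's exactly the argument. Written out, one possible phrasing:

```latex
Let $(x_n)$ be any sequence with $x_n \neq c$ and $x_n \to c$.
Since $\lim_{x \to c} f(x) = L$, the sequential criterion gives
$f(x_n) \to L$. By the Algebraic Limit Theorem for sequences,
$k f(x_n) \to kL$. Since $(x_n)$ was arbitrary, the sequential
criterion (applied in the reverse direction) yields
$\lim_{x \to c} k f(x) = kL$.
```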
12-03-2012 , 08:41 AM
game theory...
Twenty players participate in the game called "Greed". The game is as follows: all players are asked to simultaneously submit a bid between 1 and 1,000,000 in whole numbers. A player then receives a payment equal to their bid if and only if the bid is considered "non-greedy"; otherwise they stay with empty pockets. A non-greedy bid is one such that no other bid is smaller and there is at least one higher bid. Find all Nash equilibria in pure strategies. Based on your result, is there a real opportunity for players to get a lot of cash if all players are rational?
12-03-2012 , 08:51 AM
everyone bids 1
one guy bids >1, the others 1
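A brute-force check on a scaled-down instance (my own toy parameters: 3 players, bids 1 to 4) confirms these are the only pure equilibria; a Python sketch:

```python
from itertools import product

def payoffs(bids):
    lo = min(bids)
    has_higher = any(b > lo for b in bids)
    # a bid pays out iff it is minimal and at least one strictly higher bid exists
    return [b if (b == lo and has_higher) else 0 for b in bids]

def is_pure_nash(bids, max_bid):
    for i in range(len(bids)):
        base = payoffs(bids)[i]
        for dev in range(1, max_bid + 1):
            alt = list(bids)
            alt[i] = dev
            if payoffs(alt)[i] > base:
                return False  # player i has a profitable deviation
    return True

N, MAX = 3, 4  # toy instance: 3 players, bids 1..4
equilibria = [b for b in product(range(1, MAX + 1), repeat=N)
              if is_pure_nash(b, MAX)]
# every equilibrium is either all-ones or has exactly one bid above 1
print(equilibria)
```

Note that with two or more high bidders, any one of them can deviate to a bid of 1 and collect (a higher bid still exists), which is why "exactly one" matters.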
12-03-2012 , 11:47 PM
Linear Algebra... Finals approaching.
Prove, or provide a counterexample:

Any set of nonzero mutually orthogonal vectors is independent.
I'm 99% sure it's true.

I'm having a little difficulty proving independence abstractly. I know that orthogonality of two vectors (e.g., if u and v are column vectors) is defined as u^T v = 0. And I suspect I will need to prove it by showing that a matrix made up of the vectors has full column rank. But I'm just having difficulty getting there!

Any advice is appreciated.

Thank you!
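You're right that it's true, and there's a standard proof that avoids rank arguments entirely; a sketch:

```latex
Suppose $v_1, \dots, v_k$ are nonzero and mutually orthogonal, and
$$c_1 v_1 + \cdots + c_k v_k = 0.$$
Take the dot product of both sides with $v_i$. Orthogonality kills
every term except the $i$-th:
$$c_i \, (v_i^T v_i) = 0.$$
Since $v_i \neq 0$, we have $v_i^T v_i = \|v_i\|^2 > 0$, hence $c_i = 0$.
All coefficients vanish, so the set is linearly independent.
```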