Is there more of a risk that I will lose money if I invest in a stock that has a 1% expected rate of return (cost of equity) than there is if I invest in a stock with a 15% expected rate of return?
I'm expected to make less money, sure, but am I equally likely to lose money on the investment?
It's important to remember that the "expected rate of return" by itself says nothing about the probability that the stock will increase in value, largely because the concept of an "expected rate of return" on a volatile security is mostly bogus.
Suppose we had perfect information about the probability of the stock increasing (say it's 60%); then there would be a 40% probability of the stock staying flat or decreasing. But there's no way to actually calculate that probability, so the whole exercise is a farce.
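To make that point concrete, here's a toy sketch (all numbers hypothetical) showing that an expected return is just a probability-weighted average of outcomes, so a single expected-return figure is compatible with many different probabilities of losing money:

```python
# Purely illustrative: assume we somehow knew the outcome probabilities
# and the return in each outcome (we never do in practice).
p_up, p_down = 0.60, 0.40          # hypothetical probabilities
ret_up, ret_down = 0.10, -0.05     # hypothetical returns in each case

# Expected return is the probability-weighted average of the outcomes.
expected_return = p_up * ret_up + p_down * ret_down
print(f"Expected return: {expected_return:.1%}")  # 4.0%
```

Note that a very different pair of probabilities and payoffs (e.g. a 90% chance of +0.05 and a 10% chance of -0.05) would produce a similar expected return, which is why the expectation alone tells you nothing about the chance of a loss.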
So does CAPM (or any modified, more advanced method that might be similar) really help an individual investor?
Is the idea just that, if the stock does increase, it would be expected to increase at roughly the expected rate of return?
And following up on that, would a measure of the volatility of a stock (beta) be any good indicator (at least in the short term, not long term) that a stock is going to increase rather than decrease or stay the same?
Like, would looking at the security market line be of any help in deciding whether a stock is likely to make me any money (however small that amount might be)?
Risk and expected rate of return can be totally unrelated. Think of the distributions visually: the 1% stock could have a very tight distribution, while the 15% stock could have very long tails. This is speaking mathematically.
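A quick Monte Carlo sketch of that idea (the two return distributions below are hypothetical normals chosen purely for illustration): a stock with a higher expected return but much fatter tails can actually be *more* likely to lose you money than a low-expected-return stock with a tight distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical one-year return distributions (illustrative numbers only):
# "tight" stock:  1% expected return, 2% standard deviation
# "fat-tailed" stock: 15% expected return, 60% standard deviation
tight = rng.normal(loc=0.01, scale=0.02, size=n)
fat_tailed = rng.normal(loc=0.15, scale=0.60, size=n)

# Estimated probability of losing money (return below zero) for each.
print(f"P(loss), 1% tight stock     : {np.mean(tight < 0):.1%}")       # ~31%
print(f"P(loss), 15% fat-tailed stock: {np.mean(fat_tailed < 0):.1%}")  # ~40%
```

So under these assumed distributions, the 15% stock loses money more often than the 1% stock, despite the much higher expectation — the expected return ranks the *averages*, not the loss probabilities.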
If you're looking at fundamentals, think of risk as being related to earnings, ROE, and debt levels, as well as other common-sense evaluations.
Time preference also matters.