Xero Scythe said:
I was always taught it was both. Picture it on a number line and you'll see why. 5/6 is smaller than 5/3, because the denominator (the 6 and the 3, respectively) is smaller in the second fraction. And the closer the denominator gets to zero, the bigger the final answer gets. 1/2 is .5, but 1/.5 is 2, and so the numbers get bigger and bigger...
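If you want to watch that shrinking-denominator effect play out numerically, here's a quick Python sketch (the specific denominators are just ones I picked to make the trend obvious):

```python
# Watch 5/d blow up as the denominator d shrinks toward zero.
for d in [6, 3, 1, 0.5, 0.1, 0.001]:
    print(f"5/{d} = {5 / d}")
```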
But then you reach zero. Because zero is the smallest number there is (negative numbers work basically the same way as their absolute-value counterparts here), the answer stretches on into infinity, which humans cannot count to. Hell, a human will die before reaching one billion! Because of this, we use the zero with a line through it, the sign for undefined/divide by zero, clear mathematical language saying "Whoever made this problem really fucked up."
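For what it's worth, most programming languages agree that the only sane move here is to refuse outright. A quick Python example:

```python
# Python won't hand you an answer for 5 / 0; it raises an error,
# which is its own way of flagging the problem as broken.
try:
    print(5 / 0)
except ZeroDivisionError as err:
    print("refused:", err)
```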
The problem with x/0 = infinity is that infinity isn't a number, and doesn't act like one, so letting somebody stick infinity into their equation whenever they need to divide by zero will just produce weird, useless answers. What's infinity + 1? Well, it's still infinity, so...
1 = 1
1 + infinity = 1 + infinity
/add infinity to both sides
infinity = 1 + infinity
/infinity + 1 is still infinity
0 = 1
/subtract infinity from both sides
Since you can't treat infinity like a number, defining a number divided by zero as infinity doesn't buy you anything useful. You still can't get a meaningful answer out of your equation.
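Funny enough, IEEE 754 floating-point hardware does define x/0.0 as ±infinity (Python itself still raises on the division, but its float("inf") follows the same arithmetic rules), and you can watch exactly the breakdown described above happen:

```python
import math

inf = float("inf")

# "infinity + 1 is still infinity":
print(inf + 1 == inf)  # True

# so "subtract infinity from both sides" doesn't give you 0 = 1,
# it gives you nan: "not a number", i.e. no meaningful answer at all.
print(inf - inf)
print(math.isnan(inf - inf))  # True
```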