theriddlen said:
Also, to get this system fully working I had to ask on forums. OSX and Windows did not require me to do so.
And learning something new is bad, how?
Linux requires you to learn and understand the system you're using. If you'd prefer not to use that organ behind your eyes, and would rather pay someone else to handle the technical details (Microsoft, Apple, etc.), then you should go with another OS that better meets your needs.
Linux's initial learning curve is high because it's different. Users can't just dive right in without thinking. But it's rewarding in the long term...I don't spend a dime on software any more. Nor do I pirate software. I know how to lock down a system. Set up an entire computing infrastructure for a business. Kiosks for a local library. Modify it to meet my needs. Build servers and firewalls to protect the network of a non-profit that can't afford an expensive commercial-class firewall/gateway, etc...Essentially, use it in ways that help others. My system uptimes are counted in months to years. (And it doesn't hurt that I can make a little money on the side as well!)
The OSX/Windows learning curve is low, but you pay for it in other ways.
For example:
* Windows is limited by the licensing conditions dictated by Microsoft. Educational use is different from MSDN use, which itself is different from TechNet use...which is different from OEM and retail use. And there are more conditions if you want to use the supercomputing variant, etc.
* OSX is similar in that you're not supposed to run it on a system that wasn't made by Apple.
* The only condition (oversimplified) for Linux is that if you plan to distribute modified code, you must share what you've changed. There's no "Genuine Advantage" to enforce how many copies I can use, and no licensing conditions based on the user's purpose. I don't have to worry about whether the copy of the OS installed on my system is legit or pirated. I can install it as many times as I want, wherever I want.
* Windows and OSX...their respective companies are responsible for how the system turns out. Linux? The user is responsible. Whether you want that level of control and responsibility is up to you. But you have to accept that if Microsoft or Apple implements a change you don't like, you're stuck with it. Like it or lump it.
* Windows/OSX users rarely learn how to prevent infections or system compromise. As a result, they're susceptible to FUD from AV companies (which sell them ineffective cures) and to social engineering from malware writers...In Linux, people learn and understand what causes a security issue. They can then apply proper fixes to the affected application, or apply prevention methods until a fix arrives. (User knowledge and experience are key to security in computing...and the majority of the world's computing users don't score high in that category.)
* Microsoft and Apple can choose not to fix a security issue for whatever reason. The clearest examples I know of were with Windows 2000 (before its support period expired) and XP (still supported until 2014). MS's own security bulletins openly admit that they won't or can't fix a problem because it would require a major rewrite of the affected component. Effectively, you're on your own. They suggest using a firewall to block the affected ports. Wow! Way to look after your paying customers there, Microsoft!
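That "block the affected port at the firewall until a real fix ships" stopgap is something any Linux user can do themselves. Here's a minimal sketch using iptables; the port number (445, SMB) is just an illustrative choice on my part, not tied to any particular bulletin:

```shell
#!/bin/sh
# Stopgap: drop inbound TCP traffic to a vulnerable service's port
# until a patched package arrives. Port 445 is only an example.
PORT=445
RULE="-A INPUT -p tcp --dport $PORT -j DROP"

if [ "$(id -u)" -eq 0 ]; then
    # Running as root: install the drop rule.
    iptables $RULE
else
    # Not root: just show what would be run.
    echo "would run: iptables $RULE"
fi
```

Once a fix actually lands, the same rule can be removed by replacing `-A INPUT` with `-D INPUT` in the iptables invocation. The point isn't this specific rule; it's that on Linux you have the knowledge and the tools to protect yourself instead of waiting on a vendor.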
As in life, there's no free lunch with operating systems. Each has some pretty glaring pros and cons, many of which people never even notice.