Let's keep in mind that my main point is that we need to start viewing the idea of having a back door and keeping it safe as a problem to try to solve, rather than just sticking with the claim that it would instantly be broken.
We really just need to plan for ways to delay hackers by 5+ years for it to be a total success.
chimeracreator said:
Lightknight said:
1. What if the key generator remains on an encrypted partition that itself has no back door? BitLocker doesn't have to encrypt an entire drive, and to be honest, with multi-terabyte drives that could take a hell of a long time. But it can encrypt folders and partitions without encrypting the rest of the drive. The second- and third-factor authenticators would also be stored on devices whose encryption has no back door.
The security of the key generator isn't the primary problem with this scheme; this is part of Kerckhoffs's principle. Also, to run the generator you would still need to decrypt it, and if the generator was in constant use it would need to stay decrypted. As such, it could be harvested from memory if the host was compromised, a technique that's already widely used to steal credit card numbers.
Kerckhoffs's principle. Fantastic, I'm speaking to the right sort of person.
While it could hypothetically be harvested from memory, please understand that a BitLocker-encrypted device can't really be harvested the same way a non-encrypted device can. I keep using BitLocker because it's the form of encryption I'm most familiar with. The only case study I've seen where a BitLocker-encrypted device was cracked was a scenario where the device was already set up in a lab with the encrypted drive open, so the researchers could rapidly hard-freeze the memory to recover the key (the cold boot attack). That would be unrealistic in the real world, and it would only help you with one device rather than all of them. Even then, the multi-factor keys wouldn't be in memory, because the user wouldn't be accessing the phone via that method; they'd be using their standard password or retinal scan or whatever else. And let's not forget that even on a phone encrypted without a back door, the hard-freeze technique would still reveal the user's password in memory. But you'd still basically have to have the person enter the information and then freeze it right then. Not very useful.
Also, let me pose a question to you about the WoW two-factor authentication process. Why hasn't that been cracked? Surely there would have been a lot of money in it, just like there was in hacking people's accounts.
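For what it's worth, authenticators like Blizzard's are essentially time-based one-time passwords (TOTP, RFC 6238): the token and the server share a secret and derive a short code from the current time window, so an intercepted code is worthless within seconds. A minimal sketch in Python (the secret below is the RFC's published test value, not anything real):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, period: int = 30, digits: int = 6, now=None) -> str:
    """RFC 6238 time-based one-time password over HMAC-SHA1."""
    # Count how many 30-second windows have elapsed since the Unix epoch.
    counter = int((time.time() if now is None else now) // period)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: at t=59s this secret yields "94287082" (8 digits).
print(totp(b"12345678901234567890", digits=8, now=59))
```

The scheme has held up because the shared secret never leaves the token, and knowing one code tells you nothing useful about the next one.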
Lightknight said:
2. Why would they be able to remove the two- and three-factor authentications if those are required to gain access to the phone in the first place? Hypothetically, even if you already knew the key, you'd only have one of three or more keys. Hell, Apple could make it a hundred-factor authentication process and really gum up people's ability to rip them off. It could even be crazy, where the employee has to apply the machine factor keys in a specific order.
I'm not exactly sure what you're saying here. Authorization controls normal access to a system. Direct physical access allows you to bypass authorization layers, which leaves only encryption. If you're talking about adding authorization layers onto the server system which hosts it, that can help, but it won't stop them from being a target for hackers.
I'm not talking about the regular encryption a user relies on to get into the phone. I'm talking about gaining access to the back door to that regular encryption. In order to activate the back door you'd first have to enter the key, which would hypothetically start the phone in an alternate, limited mode that would then require the multi-factor authentications to go further. There are all kinds of potential tricks they could add: small things that would make a hacker's time a lot more difficult unless they worked in this department at Apple or an Apple employee spilled the beans. Say the limited mode also gives you access to alter the time/date, and no one tells you that you have to set the phone to a particular date before entering certain keys. They could make the whole process really convoluted. Those tricks would only be good as long as the employees stayed silent, though, and per Kerckhoffs's principle, employee silence can't be assumed.
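The ordering idea doesn't even have to rest on obscurity: if the factors are chained through a hash, the order becomes cryptographically significant, because applying them out of sequence derives a different key. A hypothetical sketch (the function name and salt are mine, not Apple's):

```python
import hashlib

def derive_unlock_key(factors: list, salt: bytes) -> bytes:
    """Chain each factor through SHA-256 so that the *sequence* matters:
    the same factors in a different order yield a different key."""
    key = salt
    for factor in factors:
        key = hashlib.sha256(key + factor).digest()
    return key

# Same factors, different order -> different (useless) key.
right = derive_unlock_key([b"machine-key-1", b"machine-key-2"], b"device-salt")
wrong = derive_unlock_key([b"machine-key-2", b"machine-key-1"], b"device-salt")
print(right != wrong)
```

With a construction like this, the order is part of the key material itself, so it stays secure even after Kerckhoffs's hypothetical leak of the procedure.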
Lightknight said:
3. I'm not talking about these things being given to the government. I'm talking about Apple and Microsoft and whoever else having a department like this and being able to bill the government for their services in unlocking these phones. I think it's totally bogus for the government to force anyone to do a job. The government can't enact temporary slavery just because they don't know how to get into a phone. Hypothetically, this would also produce a side business for these companies if someone gets locked out of their own phone and can prove their identity.
These companies honestly have a hard enough time securing themselves against everyone who wants to hack them now. They don't want to add an even more tempting target on their backs.
These companies already have the source code for everything they make in their networks right now. If they get hacked and lose vital material like that, then nothing can protect them from hackers anyway. This would make them no less secure and no more of a target. It just means that if they are successfully hacked, what the hackers get would be directly usable rather than source code they'd have to reverse engineer later.
But again, this department should be neither networked nor internet-facing. It should be its own work group, or maybe the machines serving as the multi-factor authenticators shouldn't even be connected to one another directly. To hack them you'd have to do so on site, one at a time (or over multiple connections at once). If I were to set something like this up, my two primary responsibilities would be ensuring that the machines are only accessible within the rooms where phones would be opened, and nowhere else, and monitoring my employees vigilantly.
Make no mistake though, what I'm proposing Apple do would be a business service that they'd be paid to do. So even if they are incurring additional risk, it's no different than any other venture they might start up.
Lightknight said:
4. I know you say they don't respect physical boundaries. But all I'm saying is that in addition to email based two factor authentication, they could also set up entire machines that are fully encrypted (again, no back doors present on these key machines) that themselves have to shake hands with the phones. Machines that are never exposed to the internet and generally only shake hands with these phones. That could present a markedly more difficult challenge to spoof in that you'd literally have to be one of a handful of people who maintain these machines that could have any kind of serious access to them.
Again, this has to do with the authentication phase to get the master key, not encryption.
Let's talk hypothetically for a second. Let's say all BitLocker-encrypted hard drives had two keys: whatever the user sets up, and a second key that Bill Gates knows.
If Bill Gates himself sat at a desk and entered said password himself whenever the FBI brought him a hard drive, how would that system be broken, hypothetically? The assumption being that he won't spill the beans.
With that in mind, imagine three or more machines that can't spill the beans because they can never communicate with the outside world. Each knows a key and only communicates it with a phone that is directly hooked into them and has already been given the necessary preceding keys and the algorithm login.
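The "several machines that each know part of the key" idea is basically secret splitting. Here is a minimal all-or-nothing XOR scheme in which every share is required to reconstruct the master key; a real deployment would more likely use Shamir's secret sharing so that any k of n shares suffice:

```python
import secrets

def split_key(master: bytes, n: int) -> list:
    """Split a master key into n shares; ALL n are needed to recombine.
    Each share alone is indistinguishable from random bytes."""
    shares = [secrets.token_bytes(len(master)) for _ in range(n - 1)]
    last = master
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list) -> bytes:
    """XOR all shares back together to recover the master key."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

master = secrets.token_bytes(32)
print(combine(split_key(master, 3)) == master)
```

Compromising one or two of the air-gapped machines yields nothing but random-looking bytes; only the full set, applied to the phone in question, reconstructs anything useful.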
Lightknight said:
5. The initial symmetric key is really just to start the communication. The idea is to assume that this will eventually get out, but to make it hard enough that it should take years to reverse engineer; hopefully years after the next phone has already replaced it and made decrypting it pointless. If you really think about it, our current market only needs a delay of about 3 years minimum. If they can get to 5 or more years, then for all intents and purposes it will succeed.
Nope, the first symmetric key is for the phone itself. This is the only key that can actually decrypt data on the phone. Everything else is just protection for it, including the fact that it is encrypted under a much weaker user-provided passkey.
I'm talking about the back door key I'm proposing in my hypothetical. It wouldn't be used to access the phone the same way the current user password does. It would be used to start the back door sequence. Once activated the phone then starts listening for the designated machines and their multi-factor keys.
The back door should be a different way into the phone. Not just a second way to enter the same path. I would assume similar principles of bricking the phone would apply if that back door was accessed incorrectly too.
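The bricking behavior mentioned above is straightforward to express: a gate that counts failed attempts on the back-door key and permanently locks after too many. A hypothetical sketch (class and method names are mine):

```python
import hmac

class BackdoorGate:
    """Hypothetical: permanently lock ("brick") after too many
    incorrect attempts at the back-door key."""

    def __init__(self, expected_key: bytes, max_attempts: int = 3):
        self._expected = expected_key
        self._remaining = max_attempts
        self.bricked = False

    def try_unlock(self, key: bytes) -> bool:
        if self.bricked:
            return False
        # Constant-time comparison avoids leaking the key via timing.
        if hmac.compare_digest(key, self._expected):
            return True
        self._remaining -= 1
        if self._remaining == 0:
            self.bricked = True
        return False
```

On real phones the equivalent counter lives in tamper-resistant hardware, so an attacker can't simply reset it from software.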
Lightknight said:
6. Is there any secure way to update security on phones? My guess is that, since updates are transmitted, it might be impossible. But if secure connections can be made, or if people go into physical storefronts to update, then it's not unreasonable that every couple of years they could change the algorithm and factor keys for the security-conscious.
In general, yes, it is possible to update the security on phones. However, the most secure systems available use purpose-built chips for this, and as such they cannot be upgraded; on the other hand, they aren't vulnerable to a host of attacks that the iPhone can be hit with.
I was just wondering if it would be possible to change up the algorithm and expected multi-factor keys from time to time. If you have a system that cannot reasonably be cracked within, let's say, a year, it would be crafty to push out a new algorithm every 11 months to reset the time-to-crack clock.
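The rotation schedule itself is simple arithmetic: with a rotation period of roughly 11 months, the key generation a device should be on is just elapsed days divided by the period. A toy sketch (the 330-day period is the post's "every 11 months" suggestion, nothing official):

```python
from datetime import date

def current_key_generation(deployed: date, today: date,
                           rotation_days: int = 330) -> int:
    """Which key/algorithm generation a device should be on, assuming
    keys rotate every ~11 months (330 days) as suggested above."""
    return max(0, (today - deployed).days) // rotation_days

# A device deployed Jan 2016 would roll to generation 1 by Dec 2016.
print(current_key_generation(date(2016, 1, 1), date(2016, 12, 1)))
```

The important property is the one argued in the thread: as long as rotation happens faster than the expected time-to-crack, any key an attacker recovers is already obsolete.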