Please read the article before commenting, because I find the proposed solution a bit worrisome.
Of course we should secure IoT, but the article is about one very particular kind of security: roots of trust. The idea is that devices shouldn't run unsigned software, so forget about custom firmware and, more generally, about owning the hardware.
There is a workaround, sometimes called "user override", where owners can set their own root of trust so that they can install custom software. It may involve some physical action, like pushing a switch, so that it cannot be done remotely by an attacker. But the article doesn't mention that; in fact, it specifically says that the manufacturer (not the user) is to be trusted, and that an appropriate response is to reset the device, making it completely unusable for the user. Note that such behavior is considered unacceptable by GPLv3.
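To make the idea concrete, here's a minimal sketch of what a user override could look like. This is illustrative only (real implementations live in a bootloader or secure element, not Python), and the key slot path and `physical_switch_pressed()` helper are hypothetical: firmware signed by either the manufacturer key or an owner-enrolled key is accepted, but enrolling an owner key requires the physical switch, so it can't be done remotely.

```python
# Sketch only: paths, key material, and the switch helper are placeholders.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

MANUFACTURER_KEY = b"\x00" * 32           # placeholder; baked into ROM at the factory
OWNER_KEY_SLOT = "/secure/owner_key.pub"  # hypothetical slot, writable only during enrollment

def physical_switch_pressed() -> bool:
    """Hypothetical GPIO read; the point is that it requires physical presence."""
    raise NotImplementedError

def enroll_owner_key(new_key: bytes) -> None:
    # The override: owners may install their own root of trust,
    # but only with physical access, so a remote attacker cannot.
    if not physical_switch_pressed():
        raise PermissionError("owner key enrollment requires the hardware switch")
    with open(OWNER_KEY_SLOT, "wb") as f:
        f.write(new_key)

def firmware_is_trusted(image: bytes, signature: bytes) -> bool:
    trusted_keys = [MANUFACTURER_KEY]
    try:
        with open(OWNER_KEY_SLOT, "rb") as f:
            trusted_keys.append(f.read())
    except FileNotFoundError:
        pass  # no owner key enrolled; only manufacturer-signed images are accepted
    for raw in trusted_keys:
        try:
            Ed25519PublicKey.from_public_bytes(raw).verify(signature, image)
            return True
        except (InvalidSignature, ValueError):
            continue
    return False
```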
There are cases where locking owners out is appropriate. GPLv3 makes a distinction between hardware sold to businesses and "User Products", and I think that's fair: you probably don't want people to tinker with things like credit card terminals. But the article makes no such distinction, even implying that consumer goods are to be included.
Not only that, "roots of trust" and locking users out of their devices are what cause the IoT omnishambles in the first place. The foundational problem is that some company makes millions of devices and then goes out of business or otherwise stops supporting them, but because the users are locked out of the device, nobody else can support them either. Meanwhile people keep using them because the device is still functional, modulo the unpatched security vulnerabilities.
If anyone could straightforwardly install the latest DD-WRT or similar then it's solved, because then you don't have to replace the hardware to replace the software, and the manufacturer could even push a community firmware to the thing as their last act before discontinuing support.
> and the manufacturer could even push a community firmware to the thing as their last act before discontinuing support.
This should be held in escrow before the device can be sold. And the entity doing the escrow service should periodically build the software and install it onto newly-purchased test devices to make sure it's still valid.
If the company drops support, either by going out of business or by simply allowing issues to go unaddressed for too long, then the escrowed BSP/firmware is released and the people now own their own hardware.
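A rough sketch of what that periodic check could look like (Python, with the build and flash commands standing in for whatever the vendor's real tooling is): the escrow agent rebuilds the deposited BSP on a schedule and verifies the output actually boots on a newly purchased retail unit, so the deposit can't silently rot; a separate rule decides when support has lapsed and the escrow is released.

```python
# Sketch of an escrow agent's periodic verification job. The build target,
# paths, and "flash-tool" command are hypothetical placeholders.
import subprocess
from datetime import datetime, timedelta, timezone

ESCROW_REPO = "/escrow/vendor-bsp"      # the deposited source and toolchain
SUPPORT_DEADLINE = timedelta(days=180)  # e.g. max allowed gap in vendor updates

def build_firmware() -> str:
    # Hypothetical: whatever the deposited BSP's documented build entry point is.
    subprocess.run(["make", "-C", ESCROW_REPO, "image"], check=True)
    return f"{ESCROW_REPO}/out/firmware.bin"

def flash_and_boot_test(image_path: str, device_port: str) -> bool:
    # Hypothetical: flash a newly purchased retail unit and check it boots.
    result = subprocess.run(["flash-tool", "--port", device_port, image_path])
    return result.returncode == 0

def periodic_check(device_port: str) -> None:
    image = build_firmware()
    if not flash_and_boot_test(image, device_port):
        raise RuntimeError("escrowed BSP no longer produces a bootable image")

def should_release_escrow(last_vendor_update: datetime) -> bool:
    # Trigger: bankruptcy, or simply letting issues go unaddressed for too long.
    return datetime.now(timezone.utc) - last_vendor_update > SUPPORT_DEADLINE
```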
That seems like a lot of complication when the better solution is to have people own their hardware from day one.
You also need the community around the device to already exist on the day support is discontinued, rather than having to build one at that point around hardware that is years old and no longer available for purchase.
I really thought escrow for software would've been SOP by now.
We made EMRs in the 2000s. Our customers required everything to be placed in escrow. It seemed abundantly prudent to me.
Maybe even prescient; our startup was bought, then murdered in its crib, leaving our customers SOL. But at least they got the source.
The issue is as much companies going out of business as consumers buying devices from shit companies.
We need schemes which enforce security and which make long term economic sense. I would require software escrow for all companies to ensure a bankruptcy doesn't mean all software is lost.
A solid 90% of the problem is that hardware companies think somebody actually wants their software. Hardware vendors are bad at software. They should not attempt to make software. They should make hardware with the expectation that customers will install whatever software they want on it, and then throw some open source code straight from GitHub on it for the customers who expect it to do something right out of the box.
Their code is bad. It should not be used. They should not even write it to begin with. Just ship the device with existing open source code with the minimum -- and published -- modifications to make it run on your device, and focus on being a hardware company.
Isn’t this what happened with the Ender 3 Pro, and people griped about it?
People gripe about everything. Don't worry about it unless their complaints have merit; if they do, then fix the problem.
This is exactly the approach of Pine64.
How user-antagonistic changing the code on IoT devices should be depends heavily on the threat model for the devices. I'm happy to trust home users to flash their lightbulbs and door locks (though the company might not see that as acceptable for its brand reputation if a lock is compromised anyway), but I would prefer not to trust the hundreds of IT departments and engineering teams to properly vet the code they are flashing onto industrial control systems when lives are at stake. What is needed there is centralized authority and accountability, with high visibility into the code base that is flashed to the devices.
I would add that a root of trust secures against rare, advanced attacks like the "evil maid" or supply chain attacks. You should worry about those only after you've secured against the basic vulnerabilities that 90% of IoT devices haven't even addressed.
I completely agree, and it disappoints me greatly to see articles like this, because it advocates doing exactly what the manufacturers already want to do to increase their control. It's the kind of article that people who want to lock things down will use to discredit or argue against anyone advocating a more ethical approach.