That’s entirely speculative. There are diminishing returns. Unless you’re going to host your own YouTube, the use case for 50Gbps connections to the home is quite small. 4K video streaming at Ultra HD Blu-ray bitrates doesn’t even come close to saturating 1Gbps, and all streaming services compress 4K video significantly more than what Ultra HD Blu-ray offers. The server side is the limit, not home connections.
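Ballpark arithmetic behind that claim, for anyone who wants to check it (bitrates are approximate public figures, not exact):

```python
# Rough numbers: Ultra HD Blu-ray video tops out around 128 Mbps;
# a typical 4K streaming service uses roughly 25 Mbps.
GBPS = 1_000_000_000  # bits per second

uhd_bluray_max = 128_000_000  # ~128 Mbps, near the UHD BD ceiling
streaming_4k = 25_000_000     # ~25 Mbps, typical compressed 4K stream

link = 1 * GBPS  # a plain 1 Gbps home connection

streams_at_bluray = link // uhd_bluray_max  # simultaneous UHD BD-quality streams
streams_at_4k = link // streaming_4k        # simultaneous service-quality 4K streams

print(streams_at_bluray, streams_at_4k)  # 7 40
```

So even at disc-quality bitrates, a single gigabit link carries several simultaneous 4K streams, which is why the bottleneck sits server-side.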
Now, if you want to talk about self-hosting stuff and returning the Internet to a more peer-to-peer architecture, then you need IPv6. Having any kind of NAT in the way is not going to work. Connection speed still isn’t that important.
Take a look at devContainers as an idea that might be generalized. They’re just Docker containers, so they’re big but not huge, but consider the use case:
devContainers are a complete, portable development environment, with support from major IDEs. Let’s say I want to work on a Java service. I open my IDE, it pulls the latest Java devContainer with my environment and all my tools, fetches the latest from git, and I’m ready to go. The problem with this use case is that I’m waiting the whole time. I don’t want to sit around for a minute or two every time I want to edit a program. The latest copy needs to be here, now, as I open my IDE.
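For reference, the thing the IDE pulls is driven by a `devcontainer.json`. A minimal sketch for the Java case might look like this (the image tag and `postCreateCommand` are illustrative, not a prescription):

```json
{
  "name": "java-service",
  "image": "mcr.microsoft.com/devcontainers/java:21",
  "postCreateCommand": "mvn -q dependency:go-offline"
}
```

The point is that the whole toolchain lives in the image, so "latest environment" is just "latest image" — and pulling that image fast is where the bandwidth goes.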
But you could generalize this idea. Maybe it’s the next ChromeOS-like thing: all you need is something that can run containers, and everything you do starts with downloading a container holding everything you need. If something like that happens, it’s a great example of needing responsiveness with a lot more data.
Technically I don’t. I’m also the guy running CI/CD building devContainers for my engineers. They no longer have to worry about updating certificates, tools, versions, or security patches, and IT doesn’t have to worry about a lot of crap on their laptops that IT doesn’t manage. Engineers can use a standard laptop install and just get the latest of everything they need, scanned and verified, as soon as it’s available. And since it’s all automated, I can support many variations; yes, they can pull any older version from the repo if they need to, and every project can easily be on different versions of different tools and languages.
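One common way to automate that rebuild-and-publish step is a CI job keyed to the devcontainer definition. A sketch using GitHub Actions and the `devcontainers/ci` action (the registry path and org name here are placeholders, and your CI system may differ):

```yaml
name: build-devcontainers
on:
  push:
    paths: [".devcontainer/**"]
jobs:
  java:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Builds the image from .devcontainer/ and pushes it to the registry,
      # so engineers always pull a prebuilt, scanned image instead of
      # building locally.
      - uses: devcontainers/ci@v0.3
        with:
          imageName: ghcr.io/example-org/devcontainer-java
          push: always
```

Older versions stay available as tags in the registry, which is what lets different projects pin different toolchains.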
At work, I’m on the same network, but working from home I still need the responsiveness to do my job.
Maybe don’t rely on cloud garbage for basic development?
Unless you’re going to host your own YouTube…
This is exactly what PeerTube is struggling with. That kind of bandwidth would solve the video federation problem.
See, you get it!