
Rebuild The Internet ????

Researchers explore scrapping Internet


Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap all that and start over.

The idea may seem unthinkable, even absurd, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

The Internet "works well in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."

No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the key communications techniques, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."

One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.

There's no evidence they are meddling yet, but once any research looks promising, "a number of people (will) want to be in the drawing room," said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. "They'll be wearing coats and ties and spilling out of the venue."

The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.

Rutgers, Stanford, Princeton, Carnegie Mellon and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, have also been exploring the concept.

The European Union has also backed research on such initiatives, through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.

A new network could run parallel with the current Internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.

These clean-slate efforts are still in their early stages, though, and aren't expected to bear fruit for another 10 or 15 years — assuming Congress comes through with funding.

Guru Parulkar, who will become executive director of Stanford's initiative after heading NSF's clean-slate programs, estimated that GENI alone could cost $350 million, while government, university and industry spending on the individual projects could collectively reach $300 million. Spending so far has been in the tens of millions of dollars.

And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn't necessarily mesh with the realities and needs of the commercial Internet.

"The network is now mission critical for too many people, when in the (early days) it was just experimental," Zittrain said.

The Internet's early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible — qualities that proved key to its rapid growth.

But spammers and hackers arrived as the network expanded and could roam freely because the Internet doesn't have built-in mechanisms for knowing with certainty who sent what.

The network's designers also assumed that computers are in fixed locations and always connected. That's no longer the case with the proliferation of laptops, personal digital assistants and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.

Engineers tacked on improvements to support mobility and improve security, but researchers say all that adds complexity, reduces performance and, in the case of security, amounts at most to bandages in a high-stakes game of cat and mouse.

Workarounds for mobile devices "can work quite well if a small fraction of the traffic is of that type," but could overwhelm computer processors and create security holes when 90 percent or more of the traffic is mobile, said Nick McKeown, co-director of Stanford's clean-slate program.

The Internet will continue to face new challenges as applications require guaranteed transmissions — not the "best effort" approach that works better for e-mail and other tasks with less time sensitivity.

Think of a doctor using teleconferencing to perform a surgery remotely, or a customer of an Internet-based phone service needing to make an emergency call. In such cases, even small delays in relaying data can be deadly.

And one day, sensors of all sorts will likely be Internet capable.

Rather than create workarounds each time, clean-slate researchers want to redesign the system to easily accommodate any future technologies, said Larry Peterson, chairman of computer science at Princeton and head of the planning group for the NSF's GENI.

Even if the original designers had the benefit of hindsight, they might not have been able to incorporate these features from the get-go. Computers, for instance, were much slower then, possibly too weak for the computations needed for robust authentication.

"We made decisions based on a very different technical landscape," said Bruce Davie, a fellow with network-equipment maker Cisco Systems Inc., which stands to gain from selling new products and incorporating research findings into its existing line.

"Now, we have the ability to do all sorts of things at very high speeds," he said. "Why don't we start thinking about how we take advantage of those things and not be constrained by the current legacy we have?"

Of course, a key question is how to make any transition — and researchers are largely punting for now.

"Let's try to define where we think we should end up, what we think the Internet should look like in 15 years' time, and only then would we decide the path," McKeown said. "We acknowledge it's going to be really hard but I think it will be a mistake to be deterred by that."

Kleinrock, the Internet pioneer at UCLA, questioned the need for a transition at all, but said such efforts are useful for their out-of-the-box thinking.

"A thing called GENI will almost surely not become the Internet, but pieces of it might fold into the Internet as it advances," he said.

Think evolution, not revolution.

Princeton already runs a smaller experimental network called PlanetLab, while Carnegie Mellon has a clean-slate project called 100 x 100.

These days, Carnegie Mellon professor Hui Zhang said he no longer feels like "the outcast of the community" as a champion of clean-slate designs.

Construction on GENI could start by 2010 and take about five years to complete. Once operational, it should have a decade-long lifespan.

FIND, meanwhile, funded about two dozen projects last year and is evaluating a second round of grants for research that could ultimately be tested on GENI.

These go beyond projects like Internet2 and National LambdaRail, both of which focus on next-generation needs for speed.

Any redesign may incorporate mechanisms, known as virtualization, for multiple networks to operate over the same pipes, making further transitions much easier. Also possible are new structures for data packets and a replacement of Cerf's TCP/IP communications protocols.

"Almost every assumption going into the current design of the Internet is open to reconsideration and challenge," said Parulkar, the NSF official heading to Stanford. "Researchers may come up with wild ideas and very innovative ideas that may not have a lot to do with the current Internet."
 

a "clean slate".... so they can be more involved?? Yikes... "does not satisfy all needs" ..their needs, or ours.... I don't remember being asked.

Holy crap. Not only will this thing cost us a fortune, but then the government can watch everything that we do. No thanks, I like how it is now, useful, innovative and improving without the help of the omnipresent you know who over our shoulders..... I realize that some kind of government involvement may have created the internet in the first place, but as always, accidents happen...:rolleyes: :rolleyes: :rolleyes:

I think there is something they aren't going public about...

Damn straight.

you mean like how can we tax this thing ??? :mad3:

Bingo. I just hope in 2020 we have Firefox 9 with the latest add on---crazy government pop up blocker....

this sucks. Who invited them to the party?:001_unsur
 
Holy crap. Not only will this thing cost us a fortune, but then the government can watch everything that we do. No thanks, I like how it is now, useful, innovative and improving without the help of the omnipresent you know who over our shoulders..... I realize that some kind of government involvement may have created the internet in the first place, but as always, accidents happen...:rolleyes: :rolleyes: :rolleyes:

Anybody who thinks that they are online with any sort of anonymity is only deluding themselves. Everybody leaves tracks which can eventually be traced back to the source. That is especially true if it is your own personal ISP account.

As far as improving goes, that's the whole point of the overhaul. The basic design needs to be changed so that the Internet can be more reliable and useful in the future. Part of the changes very well could do away with spam entirely, along with removing or limiting threats from viruses, trojans, and malware. That can't happen with the current design.
 
Improvements = taxation. Tax him on what he earns. Tax him on what he spends. Tax him on what he owns. Tax him all over again when he dies. Soon we can insert "Tax him on the dirty pictures he looks at" prior to death.
 
If it has the government's blessing, be afraid. Be very, very afraid. Especially if the current setup "does not satisfy all needs".
 
Anybody who thinks that they are online with any sort of anonymity is only deluding themselves. Everybody leaves tracks which can eventually be traced back to the source. That is especially true if it is your own personal ISP account.

As far as improving goes, that's the whole point of the overhaul. The basic design needs to be changed so that the Internet can be more reliable and useful in the future. Part of the changes very well could do away with spam entirely, along with removing or limiting threats from viruses, trojans, and malware. That can't happen with the current design.

I am nowhere near thinking that the current set up is "free" from "observation".... But to make another internet with their specifications? That is just too tempting for all those "well intentioned" folks.

I'll take the trojans, malware, adware, spyware and everything else we deal with now over another system bought and paid for by my taxes to replace what a) already works and b) is improving, and that will mostly c) end up costing 10X what it should, d) be "filtered" for our "protection" and "tracked" for our safety, e) end up more buggy/infected than what we have now, and f) mean soon-to-be-antiquated equipment and more indispensable 36-hour-a-week, 90k-a-year "analysts" that do "busy work" all day, protecting their jobs by being inefficient and spending their budgets entirely every year.

Improvements = taxation. Tax him on what he earns. Tax him on what he spends. Tax him on what he owns. Tax him all over again when he dies. Soon we can insert "Tax him on the dirty pictures he looks at" prior to death.

If you drive a car, I’ll tax the street,
If you try to sit, I’ll tax your seat,
If you get too cold, I’ll tax the heat,
If you take a walk, I’ll tax your feet. :sad: :sad: :sad:

If it has the government's blessing, be afraid. Be very, very afraid. Especially if the current setup "does not satisfy all needs".

That really was the most disturbing quote I saw in the article. I just wonder whose "needs" they think the current Internet is shortchanging? Pretty obvious. Pretty damn scary.
 
Just hope they won't say Microsoft will be developing the software and security of the "new Internet"...
 
So is this already in the works? Or is it a "What we should start working on..."

Will there still be an "old Internet" (what we are using now)? For instance, can I take my chances of getting trojans/viruses/spyware/etc. with this slow, outdated Internet instead of the new, "faster, safer" Internet they are working on?
 
It's not a hoax. Several organizations are working on the concept, including the National Science Foundation, Carnegie Mellon, Princeton, Rutgers, Stanford and the Massachusetts Institute of Technology.

----
Design concerns

Government and university researchers have been exploring ways to redesign the Internet from scratch. Some of the challenges that led researchers to start thinking of clean-slate approaches:
SECURITY

  • THE CHALLENGE: The Internet was designed to be open and flexible, and all users were assumed to be trustworthy. Thus, the Internet's protocols weren't designed to authenticate users and their data, allowing spammers and hackers to easily cover their tracks by attaching fake return addresses onto data packets.
  • THE CURRENT FIX: Internet applications such as firewalls and spam filters attempt to control security threats. But because such techniques don't penetrate deep into the network, bad data still are passed along, clogging systems and possibly fooling the filtering technology.
  • THE CLEAN-SLATE SOLUTION: The network would have to be redesigned to be skeptical of all users and data packets from the start. Data wouldn't be passed along unless the packets were authenticated. Faster computers today should be able to handle the additional processing required within the network.
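The per-packet authentication idea above can be sketched in a few lines: the sender binds each packet to its claimed return address with a keyed tag, and a router drops anything whose tag doesn't verify. This is only an illustration of the concept, assuming a single shared key; any real design would use something like per-sender certificates. The function and variable names here are invented for the example.

```python
import hashlib
import hmac

KEY = b"shared-secret"  # stand-in only; a real design would use per-sender credentials

def send(source: str, payload: bytes) -> dict:
    """Attach a tag that binds the payload to its claimed return address."""
    tag = hmac.new(KEY, source.encode() + payload, hashlib.sha256).hexdigest()
    return {"source": source, "payload": payload, "tag": tag}

def forward(packet: dict) -> bool:
    """A router re-computes the tag; packets with forged source addresses fail."""
    expected = hmac.new(KEY, packet["source"].encode() + packet["payload"],
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, packet["tag"])

genuine = send("alice", b"hello")
forged = dict(genuine, source="mallory")  # spammer attaches a fake return address
```

With this check inside the network, the forged packet would be dropped before it clogs downstream systems, which is exactly the extra per-packet processing the article says faster computers could now afford.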
MOBILITY

  • THE CHALLENGE: Computers rarely were moved, so numeric Internet addresses were assigned to devices based on their location. A laptop, on the other hand, is constantly on the move.
  • THE CURRENT FIX: A laptop changes its address and reconnects as it moves from one wireless access point to another, disrupting data flow. Another workaround is to have all traffic channel back to the first access point as a laptop moves to a second or a third location, but delays could result from the extra distance.
  • THE CLEAN-SLATE SOLUTION: The address system would have to be restructured so that addresses are based more on the device and less on the location. This way, a laptop could retain its address as it hops through multiple hot spots.
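The device-based addressing idea is essentially an identifier/locator split: the device keeps one permanent name, and a lookup service tracks which access point it is currently behind. A minimal toy sketch (all names here are hypothetical, not from any specific proposal):

```python
class MappingService:
    """Toy identifier/locator split: stable device IDs, changing locations."""

    def __init__(self):
        self._locators = {}  # permanent device ID -> current attachment point

    def update(self, device_id: str, locator: str) -> None:
        """Called each time a device hops to a new access point."""
        self._locators[device_id] = locator

    def resolve(self, device_id: str) -> str:
        """Senders address the stable ID; the network looks up where it is now."""
        return self._locators[device_id]

svc = MappingService()
svc.update("laptop-42", "hotspot-A")
svc.update("laptop-42", "hotspot-B")  # roaming: same identity, new locator
```

Because correspondents only ever use `laptop-42`, connections need not break when the locator changes, which is the disruption the current fix suffers from.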
UBIQUITY

  • THE CHALLENGE: The Internet was designed when there were relatively few computers connecting to it. The proliferation of personal computers and mobile devices led to a scarcity in the initial address system. There will be even more demand for addresses as toasters, air conditioners and other devices come with Internet capability, and as standalone sensors for measuring everything from the temperature to the availability of parking spaces become more common.
  • THE CURRENT FIX: Engineers expanded the address pool with a system called IPv6, but nearly a decade after most of the groundwork was completed, the vast majority of software and hardware still use the older, more crowded IPv4 technology. Even if more migrate to IPv6, processing the addresses for all the sensors could prove taxing.
  • THE CLEAN-SLATE SOLUTION: Researchers are questioning whether all devices truly need addresses. Perhaps sensors in a home could talk to one another locally and relay the most important data through a gateway bearing an address. This way, the Internet's traffic cops, known as routers, wouldn't have to keep track of every single sensor, improving efficiency.
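The gateway idea can be sketched as follows: sensors have no global addresses at all and report locally, while one addressed gateway decides what is important enough to relay upstream. The class, the address, and the threshold rule are illustrative assumptions, not part of any actual design.

```python
class Gateway:
    """One Internet-visible address fronting many unaddressed local sensors."""

    def __init__(self, address: str, alarm_threshold: float):
        self.address = address              # the only address routers ever see
        self.alarm_threshold = alarm_threshold
        self.outbox = []                    # readings actually relayed upstream

    def report(self, sensor_name: str, reading: float) -> None:
        """Sensors push readings locally; only notable ones leave the house."""
        if reading >= self.alarm_threshold:
            self.outbox.append((sensor_name, reading))

gw = Gateway("198.51.100.7", alarm_threshold=30.0)
gw.report("kitchen-temp", 21.5)   # ordinary reading: stays local
gw.report("attic-temp", 44.0)     # alarm reading: relayed through the gateway
```

Routers then track one gateway instead of every sensor, which is the efficiency gain the sidebar describes.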
----
Some links:
cleanslate.stanford.edu

100x100network.org

orbit-lab.org

geni.net
 
This sounds like one of those email hoaxes that run around the internets. Are you sure this is real?

This has been talked about for years.... Back when you had to use the "SLIP" command to get onto the original "Jughead, Veronica, and Archie" systems, it was talked about.

Our government has for years been trying to figure out how to tax the internet..and how to tax goods on the internet...

Most people seem to think that what will happen is once we go to all-digital systems with fiber optics, things will change. It has been talked about that what might actually happen is that there will be "hubs" built around the country and all traffic will flow through these hubs, and all traffic will be coded so that if you send a file or email from CA to NY City, there will be a tax because it will have to go to "x" amount of hubs... Kind of like a toll road.

Right now you can proxy around the globe, or if you really know what you are doing you can bounce files from different ports, hitting everything from Telnet to HTTP... And if you do it right the trail gets fuzzy enough that it can be impossible to track. Think of it like this...

I have a file that is broken down into smaller files..think of a pack of Life Savers...the ones with colors...Now the first bit of candy and the last bit of candy in the pack are the markers....

So I send each little piece of the package of candy in different directions and through different "roads"..kind of like Telnet...FTP...HTTP...HTTPS....or different ports..

When they arrive at the ending location....you have a pile of different color candies.....if you know what the first color or marker and the last..then you can put them back into the package the way they were originally sent..and the file is now back to whatever it is..
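The pack-of-candy analogy above is basically chunking with sequence markers: split the file into numbered pieces, scatter them over different routes, and reassemble by order at the far end. A minimal sketch of that idea (all function names invented for illustration):

```python
import random

def split(data: bytes, piece_size: int) -> list:
    """Break the file into numbered pieces; the sequence number plays the
    role of the candy's position between the first and last markers."""
    pieces = [data[i:i + piece_size] for i in range(0, len(data), piece_size)]
    return list(enumerate(pieces))

def scatter(pieces: list) -> list:
    """Pieces arrive out of order, as if sent over Telnet, FTP, HTTP..."""
    shuffled = pieces[:]
    random.shuffle(shuffled)
    return shuffled

def reassemble(pieces: list) -> bytes:
    """Sort by sequence number to put the candies back in the package."""
    return b"".join(chunk for _, chunk in sorted(pieces))

original = b"the quick brown fox jumps over the lazy dog"
received = scatter(split(original, 5))
```

No single route ever carries the whole file, which is why the poster says the trail gets fuzzy; only someone holding the markers can put it back together.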

If you look at this little example you can see why the different governments want to find a way to track information better...

The only real problem even if this happens is....every time someone comes up with a new way to block or keep people out...someone else finds a way around it...

Since everything you look at on the internet is just 1's and 0's it is very easy to send information to people and have no one know where or to whom it is going to...

All it really takes is posting a picture on a free photo site.........
 