Digital economy

e-book: “Social and Economic Factors Shaping the Future of the Internet”, NSF/OECD Workshop proceedings, 31 January 2007




This document contains a summary of the proceedings of an NSF/OECD workshop entitled "Social and Economic Factors Shaping the Future of the Internet", held at the United States National Science Foundation in Washington, D.C., on 31 January 2007. The workshop was co-organised by the US National Science Foundation (NSF) and the Organisation for Economic Co-operation and Development (OECD), and was sponsored by the NSF. The event attracted some 35 speakers and 100 participants. This report was prepared by Ms. Karine Perset, with the participation of Dr. Sam Paltridge, both of the OECD's Directorate for Science, Technology and Industry. The views expressed in the summarised proceedings do not necessarily represent the views of the NSF, the OECD or the OECD’s member countries.

The workshop’s goal was to discuss strategic directions for the future of the Internet, from both a technological and a policy viewpoint. To this end, a group of experts1, including economists, policy makers, social scientists and technologists, was brought together to consider a broad range of issues and questions relating to the future of the Internet.







Measurement of the Internet



The difficulty of security and options for a more secure future
Privacy implications of the Internet and other distributed systems
Consumer empowerment and disclosure



Openness and participation
Participation of all, including the developing world
Openness and content production, distribution and access
Values, creativity and culture



Minimalist regulation approach versus critical infrastructure status
Globalisation and societal needs
Economic or social externalities
Investment in infrastructure
Convergence and Internet traffic exchange
Identification and addressing



Role of governments
Role of technologists
Co-operation of technologists and policy makers








Today’s Internet is the sum of all the private and public investment, activities, decisions, inventions and creativity of a billion users, over 23 000 autonomous systems, and countless creators and innovators.
In a relatively short time, the Internet has become a fundamental infrastructure for our economies and societies and, as a result, raises increasingly important policy issues across a broad range of economic and social dimensions. Three main trends are increasingly influencing the current Internet’s ability to meet the requirements of users:

  • The openness of the Internet has been a clear factor in fostering competition and innovation, and is increasingly enabling users to develop, collaborate and distribute content and customise applications. This openness is driving a range of new social and economic opportunities and deepening the role of the Internet as a key part of our economic infrastructure.
  • Security threats endanger network operation and a trusted online environment at a time when an increasing number and variety of communities and businesses critically rely on the Internet.
  • Convergence between previously distinct networks and services towards the use of the Internet protocol generates new demands on the Internet and places strains on existing regulatory models.


In considering the range of scenarios that relate to a future Internet, an array of choices can be made, in which technological evolutions may be partnered with social policy and regulatory discussions. To examine these choices and their potential implications, a dialogue between the technical community and the policy community is crucial, and should be informed by the work of economists, social scientists, and legal experts.


“The Future of the Internet is too important to be left to chance or random developments”
David Clark

Discussants at the workshop agreed that there was a critical necessity to design future systems to be as adaptive as possible to evolving needs – whether these needs are technical, economic, social or legal in nature – as opposed to solely reflecting current requirements. They agreed on the need to draw lessons from the applications and use associated with the evolution of the current Internet and to identify the features that have been critical to the Internet’s success and its openness/fostering of what several participants called “serendipity” or, as another participant called it, “generativity”. At the same time, participants realised that the current Internet faces many challenges as it evolves to embrace new requirements, which are not only related to existing technical limitations but also to economic, social and regulatory issues.

At the outset of the workshop, participants were reminded that the question of whether future needs may be accommodated by incremental improvement of the current design of the Internet, and/or by a clean-slate approach, is being investigated by the research community, partly within the framework of the GENI project, which is both an effort to solve current Internet problems five years into the future and a longer-term effort to define requirements for the network 10 or 15 years from now. GENI will be designed for experiments with different architectures that enable an assessment of socio-economic impacts, e.g. different architectures might lead to more or less openness.


Transparency and data are needed to understand the workings of the Internet

One of the strongest themes that emerged from the workshop was the need for transparency and the need to gather data to better understand the workings of the Internet. Participants called for increased transparency on how Internet-related products and services are operated, e.g. the factors that may impinge on service quality, to help build user/consumer trust. They also stressed that increased transparency through information sharing in Internet infrastructure markets could make investments more efficient. The need for measurement of traffic flows and inter-networking efficiency was particularly emphasised, in order to enable informed technical or policy decisions with respect to future options and priorities. There was a call for measurement of Internet traffic flows that would enable researchers, industry and policymakers to understand and quantify various aspects of the Internet, such as its size, areas and patterns of growth, or potential vulnerabilities. However, there was also a call for specification of what data is needed to inform what questions, and to respect commercial confidentiality.

The call for transparency and data was also related to the need for an objective assessment of the status of resources, such as numbering resources. The status of IP addresses, namely new allocations of IPv4 addresses and adoption of IPv6, was highlighted as an issue that the OECD should examine from an economic perspective.


Building a trust framework requires both enhancing Internet security and better equipping end-users to deal with security problems

A second strong theme was that of building a framework around trust and trustworthiness, taking into account society’s need for privacy, security, consumer protection, and the potential of improved identity management systems. On security and trust, participants highlighted that many problems lie in the end-point computers or devices, due to infections by malware, and they forecasted that attacks on or from end-point computers would increase. In such a context, they stressed the joint necessity of striving to make interactions on the Internet “safe enough” for people to want to use the Internet while, in parallel, better equipping end-users to deal with security problems. Since trust requires reliable identity mechanisms, participants stressed the need to “tread carefully” with online identity mechanisms. They emphasised that personal data requested in a specific context should not be given to actors who would not be trusted in different contexts.


The privacy implications of the Internet for society are significant, with access to information about individuals’ location and presence (geolocation) being a growing concern

A substantial discussion on the privacy implications of the Internet for society took place at the workshop. Social networking sites, profiling, and advertising were discussed. It was stressed that in many cases, information is given voluntarily, but that users may not realise the quantity and durability of information collected about them. Participants highlighted an upcoming issue on who has access to location and presence (geolocation) information. The OECD Privacy Guidelines were referred to as a model to build on, but some of the Guidelines’ principles, such as consent and a centralised data controller, may be less easy to apply in areas such as social networking environments. There was also a call for more transparency and disclosure for consumer protection, on the functions that software or hardware run, and on Internet Service Providers (ISPs)’ policies regarding, for example, data retention.

All actors should take care not to “lock in” existing systems and should strive for interoperability and low transaction costs

A third theme stressed the value of the participative nature of the Internet, which allows widespread social and economic interaction, innovation, and value creation, and the importance of reinforcing policy frameworks that encourage participation on, and access to, the Internet. Participants felt that the current choices on openness and participation would impact the participative Internet of the future. They stressed that interoperability and low transaction costs were key enablers to online participation by individuals and to co-operative new business models based on the Internet. The consensus was that the future of computing and communications could take a variety of directions and that all actors should guard against locking in existing systems to the exclusion of innovation or other potential benefits.


Financing last mile connectivity is viewed as a major upcoming issue

A fourth theme related to communications infrastructure and policy. A recurrent issue throughout the workshop was how best to stimulate “creative destruction” and innovation in communications infrastructure, while at the same time creating an environment that supports investment. Participants viewed financing last mile connectivity as a major upcoming issue. They thought that disruption, even another boom/bust cycle, was possible as we add new requirements and connect many new devices to the Internet.


Some issues require policy intervention to better align incentives

A recurrent issue was the need to adapt regulation, including regulation on critical infrastructure, in a context in which heterogeneity, instability, and rapid change of the Internet industry structure make regulation difficult. There was agreement that markets were beneficial, and that market forces should be utilised in order to stimulate improvements in areas such as consumer choice. Dealing with economic or social externalities, such as those that relate to long-term social needs, was viewed as something that markets alone could not address. In particular, issues like the upgrade to IPv6 may require policy intervention to better align incentives with desired outcomes. Participants discussed models of government involvement, which could be multi-stakeholder co-operation, government contracting, liability, or regulation.


Articulating common principles for a global, shared infrastructure would be timely

Finally, the roles of different stakeholders were emphasised. The key role of governments and intergovernmental organisations was stressed in areas such as setting or codifying rules of the game, planning for the long term, or with respect to societal or economic impacts, and international co-operation. The need for multi-stakeholder co-operation was also emphasised. Participants agreed that going forward, a dialogue between the technical community and the policy community is crucial. Participants encouraged the OECD to develop common principles that could apply in the future environment, including those that are fundamental to the OECD, such as democratic, market-based principles.




“The Internet is changing from the inside out and from the outside in”
Marjory Blumenthal 4

From the inside, Internet technologies are in transition from an era of deployment and performance to an era of qualitative evolution (expanding functionality), in which a diverse range of environments enable communication in a variety of forms and situations. Primary technological trends include digital convergence toward the Internet Protocol (IP, e.g. VoIP, IPTV); towards mobility (e.g. mobile broadband); towards human-oriented applications (e.g. the participative web, often described as Web 2.0, or intelligent user interfaces); networked information technology with the web as the platform (e.g. application service providers, Web Services, service-oriented architectures); and intelligent objects that can sense and control (e.g. RFID, home networks, or intelligent transportation systems). Mobile computing and embedded devices are expected to play a leading role in future computing and communication.


“Expect more devices, more bandwidth and more data: 100, 1,000, even 10,000 times more”
Michael R. Nelson

From the outside in, the Internet is now a critical infrastructure underpinning economic and social activity at a global level. Accelerating technological development in relation to the Internet has tremendous technological, political, social, and cultural ramifications that are difficult, or in many cases impossible, to comprehend. The Internet is rapidly evolving into a broadband network-of-networks, with increasing fixed and wireless access, supporting around a billion users. In the future it is expected that the Internet will connect an ever-greater number of users, objects and critical information infrastructures. The role of the Internet as a social and economic infrastructure is deepening. With this, the Internet must meet the social demands placed upon it, expand opportunities for innovation and economic growth, be robust and secure, and scale to evolving requirements. In what follows, the main points are categorised by leading themes discussed at the workshop.





Lack of accountability was stressed as the biggest impediment to improved safety, scalability, sustainability and stewardship on the Internet. This lack of accountability was linked to the lack of market transparency on the Internet.


Lack of transparency of the Internet infrastructure can lead to investment inefficiencies

While network operators are fierce competitors, the nature of the Internet dictates that they must also co-operate, and there is a need for technical co-ordination to ensure efficient inter-networking. Several participants stressed that the lack of information in the public domain in the telecommunication and Internet sectors led to large inefficiencies in these markets during the “Internet bubble”. For example, in the late 1990s, network operators invested in capacity far ahead of demand, and largely on the same routes, while there was a lack of capacity on other routes.

If markets had been better informed of factors such as the growth and nature of traffic, better decisions and outcomes could have been possible. A lack of such information can be a barrier or deterrent to investors, whether they are private sector operators, governments, aid agencies, or others who may wish to invest in networks. The importance of preserving the confidentiality of commercial information for efficiency and security reasons was noted. However, participants agreed that greater amounts of data at an aggregated level should be available to researchers, capital markets, policy makers and so forth. There was also a call to identify more precisely what information is required from network operators.


Transparency and disclosure of hardware and software functions and of service providers’ policies could help improve trust/trustworthiness

Several participants stated that many privacy and security issues could be addressed through greater transparency about what tasks software and hardware components are undertaking and why. While there are currently many different practices with regards to disclosure, clear standards could specify what information should be disclosed and how.

In the area of software, some participants stressed that all the functions performed by software should be specified at the outset, in a similar way to that of food labelling.5 Understanding what functions are undertaken by installed software is essential so that a user can understand what a computer is doing when connected to the Internet and whether this activity is warranted. This is, however, not without challenges. Malware, for example, is often difficult to differentiate from legitimate software in terms of how it uses the functionality of a user’s equipment.

In addition, it was suggested that companies’ retention policies should be made clear in a consistent manner, so as to allow users to compare service offerings. It was felt that such disclosure of policies might help users to make better-informed decisions on products and services. Participants gave examples of ISPs recording certain types of information for business or legal reasons. They said the same applies to other types of service providers, such as free web mail accounts.

Measurement of the Internet

"If you can't measure it, you can't manage it"
Internet operators 6

Many participants strongly felt that a greater amount of information should be available in the public domain, to support efficient inter-networking. This could include, for example, information that would provide a better understanding of the Internet’s points of vulnerability or recovery after network failure. The question of the impact of recent earthquakes in Asia, that severed a number of undersea cables, was raised, and in particular how the event affected communications, how the network recovered, and what the economic and social impacts in the countries affected were. Participants reiterated that there needed to be a clearer delineation of what information should be shared in the public domain, while noting the need to protect commercial confidentiality. Researchers noted that identifying the information to which they need to have access first requires articulating the questions to be addressed.


Proposal for measuring the current Internet

A comment was made that much of the information about the Internet that is now in the public domain was not the result of a transparent methodology, repeatable by computer scientists and other network researchers. A representative from the Packet Clearing House (PCH) raised the possibility of introducing standardised measurement practices for Internet eXchange Points (IXPs), and for gathering sample IP traffic information that could provide a better idea of aggregate traffic flows.7, 8 Matrices of IXP participation and sample IP traffic information could be correlated with two additional datasets that PCH collects: topology of the network that can be observed and relationships that can be inferred. It was stated that with this data, one can further begin to extrapolate the volume of Internet traffic being generated in areas of the Internet that are not under the purview of IXPs. Participants at the workshop were generally supportive of having greater capabilities for sharing aggregate inter-networking information, both at present and for future technological developments.
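The sampling-based approach described here can be illustrated with a minimal sketch. The AS numbers, record format and sampling rate below are invented for illustration and do not reflect PCH's actual methodology: sampled byte counts are scaled back up by the sampling rate and aggregated into an origin-destination traffic matrix.

```python
from collections import defaultdict

# Hypothetical sampled flow records observed at an exchange point:
# (origin_AS, destination_AS, sampled_bytes). All values are illustrative.
SAMPLING_RATE = 1000  # one packet in every thousand is sampled

records = [
    ("AS64500", "AS64501", 1_200),
    ("AS64500", "AS64502", 300),
    ("AS64502", "AS64501", 450),
    ("AS64500", "AS64501", 800),
]

def traffic_matrix(records, sampling_rate):
    """Aggregate sampled flows into an estimated origin-destination matrix."""
    matrix = defaultdict(int)
    for origin, destination, sampled_bytes in records:
        # Scale the sampled volume back up to an estimate of total traffic.
        matrix[(origin, destination)] += sampled_bytes * sampling_rate
    return dict(matrix)

estimates = traffic_matrix(records, SAMPLING_RATE)
for (origin, destination), volume in sorted(estimates.items()):
    print(f"{origin} -> {destination}: ~{volume:,} bytes")
```

Correlating several such matrices, as suggested above, would then allow inferences about traffic in areas not directly observed.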


End-point computers as potential data gathering points

The idea was put forward of having end-point machines themselves acting as “data points” for tracking the overall condition of a given PC and more data-gathering about the Internet. Here the idea would be that end-points could signal to other end-points and possibly provide information to some central “public interest aggregator”. Such information might include the number of machines connected to the Internet and their status, e.g. if they report being free from infections by known viruses or “bots”. In this way, it was suggested that they could provide information to other users in real time, with or without intermediaries.
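As a rough illustration of this "public interest aggregator" idea, the following sketch assumes a hypothetical report format and aggregator class (all names here are invented): end-points submit a minimal self-reported status record, and the aggregator publishes only aggregate counts rather than individual machine data.

```python
from dataclasses import dataclass

@dataclass
class EndpointReport:
    """A minimal, self-reported status record from an end-point machine."""
    machine_id: str
    online: bool
    clean: bool  # self-reported: no known virus or bot infections detected

class PublicInterestAggregator:
    """Hypothetical central collector that publishes only aggregate counts."""

    def __init__(self):
        self.reports = {}

    def receive(self, report: EndpointReport):
        # Keep only the latest report per machine.
        self.reports[report.machine_id] = report

    def summary(self):
        online = [r for r in self.reports.values() if r.online]
        clean = sum(1 for r in online if r.clean)
        return {"online": len(online), "reported_clean": clean}

agg = PublicInterestAggregator()
agg.receive(EndpointReport("pc-1", online=True, clean=True))
agg.receive(EndpointReport("pc-2", online=True, clean=False))
agg.receive(EndpointReport("pc-3", online=False, clean=True))
print(agg.summary())
```

A real deployment would of course need to address the trustworthiness of self-reports, which is precisely the security problem discussed elsewhere in this report.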


Measuring the economic and social value of new content being created on the participative web

Measuring the economic or social value of new content being created on the Internet was considered as consistent with some of the OECD’s core expertise. On the economic side, the notion of complementarities was highlighted as being important, with many goods being created to “complement” others.9 While transaction-based markets are the basis for many economic statistics, the fact that the Internet is a virtual marketplace implies that transactions can be for profit, for free, or for a barter arrangement, and not necessarily quantifiable. With most interactions taking the form of complex barter arrangements, it is considered difficult to measure such an environment, unless proxies can be found, such as comparing the value of OpenOffice with the value of Microsoft Office.





Participants stated that users will need to better understand the degree of personal risk they face in online environments. Increasing user understanding was identified as an area which could be the subject of governmental agreement, around the three ways to mitigate current abuses of trust: i) technical prevention, ii) post-hoc law enforcement, and iii) education to improve awareness.


The difficulty of security and options for a more secure future


The inherent difficulty of security and manageability


“Currently we have a security lumberyard, rather than a security architecture”
David Clark

Participants noted that the technical community and other stakeholders have been trying to improve security for more than two decades but have had little overall success, due in part to the little consideration that was given to security in the original design of the Internet. Participants also noted that while the Internet successfully transfers data, such data is very difficult to manage for both network providers as well as individual users.


Most problems lie in the end nodes

Participants attributed security problems to social and economic factors, as well as technical ones, highlighting the need to focus on the end-nodes, which are not managed by network operators.


Many security problems involve a balance of interests among actors, states and societies

It was noted that the security issue is made more difficult by the lack of agreement among actors on what “Internet security” should be and that, as a result, the objectives of “classic” security, availability, and resilience, must be defined broadly and imperfectly. While there is a need for a more coherent security architecture, it was pointed out that different approaches might be required for different contexts; therefore coherent security need not mean homogeneous security.


Trust and tools to enhance trust

The discussion pointed out that the lack of a trusted online environment is detrimental to most of the economic and social goals that are attached to the current and future Internet. While total trust may not be fully achievable through technical design, it is necessary to impose constraints on Internet transactions so that they are “safe enough” for people to want to use the Internet. In addition, to adequately determine what constitutes “safe enough”, the level of security of transactions must be measurable.


The need to tread lightly in designing digital identity mechanisms

Participants noted that trust is closely related to the reliability of identity mechanisms. They cautioned that such mechanisms should not be allowed to “overshoot” their purpose and place control in the hands of actors that the users may not trust in another context. One example given by a participant was equating identity with a credit card: not only do many Internet users not own a credit card, but credit card identification would remove some anonymity from the Internet.

Several participants expressed concern over the proliferation of identity management systems and their adoption by universities. They stressed their view that the privacy implications of these systems had not been thought through and that, in addition, considering the benefits and risks of identity management systems, given their cross-border nature, requires international co-operation.


Social trends have repercussions on security

Several social trends with repercussions on security were noted at the workshop by participants:

  • An increasing amount of information is online, whether for individuals, corporations, or governments; this includes information about critical information infrastructure.
  • Much of that online information is under third party control (e.g. ISPs provide e-mail; application service providers manage and store business information).
  • Criminals are increasingly using the Internet: hacking has evolved from being a “hobby” for amateurs to a criminal activity; identity theft is increasing, along with other phenomena such as spam and incidences of Denial of Service (DoS)-based extortion (though statistical sources do not always agree on magnitudes). One researcher raised the question of which statistical sources should be considered legitimate by policymakers in this field.
  • Equipment attached to the network is increasingly untrustworthy for both users and the parties with which they communicate or interact: fraud attacks on home PCs through trojans, spyware, etc. render the network layers of authentication useless. As service providers such as banks increase their security, more attacks on the end-points, or from the end-points, are expected. The end-user is becoming the “attacker”, unknowingly acting on behalf of other entities that break into his/her machine to execute malicious code, e.g. botnets.
  • Regulatory oversight and standards are increasing: regulations, such as Sarbanes-Oxley in the United States, the European Union’s Data Retention and Data Protection Directives, or corporate standards, are increasingly determining how security is to be implemented.


Globalising nature of the Internet and jurisdictional arbitrage

Globalisation was identified by participants as presenting a major security challenge, due to the increasing number of criminals that engage in jurisdictional arbitrage. Criminals frequently operate from countries that lack adequate legal mechanisms to deal with computer crime or enforcement of laws. Where such laws do exist there may not be enforceable extradition treaties. Participants noted that whereas laws and law enforcement are based on proximity, criminals are able to operate remotely and, in addition, frequently route through several third-party countries. Participants urged that a range of protection mechanisms against such activity be devised, recognising that any such protection will have to continuously evolve, as criminals’ strategies are also adapting to whatever new protection is built into the system.


Security versus freedom at the edges

The question was raised of how to enhance network security while enabling access to the information that users demand and allowing the serendipity that leads to unexpected innovation at “the edges”. Several technical ideas for enhanced network security were suggested:

  • Diffusion of traffic rather than traffic optimisation, as a strategy for making the Internet more secure. The point was made that Denial of Service (DoS) attacks, which involve concentrating traffic from many systems onto one system with the intent of flooding it, are made possible because the Internet assumes a conversation between a given source and a single named destination. One suggested solution was therefore for the destination machine to connect through a large number of “bodyguards”, so as to diffuse the attack. An example given for such a “diffusion” approach is the Akamai web-caching service, where many copies of the content are made so as to resist “flash crowds”.
  • Shift focus to security of information, not connections. It was noted that an approach in which the content of a web page could be authenticated independently of the connection over which it came could provide better security than authenticating the connection itself (e.g. via SSL), and it would allow easier strategies for dissemination of information (such as in publish-subscribe systems, peer-to-peer systems or disaster fall-back mechanisms). In essence, it would decouple securing a connection from ensuring legitimacy of content.
  • Use virtual machines to isolate activities. Another security approach mentioned would be to use virtual machines, separating a “RED” zone with an open default mode for experimentation with new or otherwise suspect code, from a “GREEN” zone of activity for important activities such as online banking or business, open only to trusted parties.
  • Trust-modulated transparency. In this proposed approach, users could choose with whom they are willing to interact, based for example on third parties, regulation, public opinion, or the level of transparency other users offer; the traffic-routing structure would then estimate the trustworthiness of packets of data and set aside suspect packets for further screening.
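The second of the ideas above, authenticating the information rather than the connection, can be sketched minimally as follows. This is illustrative only: a publisher announces a digest of the content through some trusted channel, and a receiver can then verify a copy obtained from any mirror, cache, or peer. A real system would use digital signatures rather than a bare hash.

```python
import hashlib

def publish(content: bytes) -> str:
    """Return the digest the publisher announces through a trusted channel."""
    return hashlib.sha256(content).hexdigest()

def verify(content: bytes, announced_digest: str) -> bool:
    """Check a copy received from an untrusted source against the digest."""
    return hashlib.sha256(content).hexdigest() == announced_digest

original = b"<html>press release</html>"
digest = publish(original)

# A copy served by any intermediary verifies if and only if it is unmodified,
# regardless of which connection it arrived over.
assert verify(original, digest)
assert not verify(b"<html>tampered</html>", digest)
```

This decoupling is what makes dissemination through caches, peer-to-peer systems, or disaster fall-back mirrors compatible with end-to-end assurance about the content itself.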


Harness competition and market forces to increase security

“Security is made up of trade-offs”
Bruce Schneier

Participants agreed that security is an economics problem, rather than just a technological problem. According to a leading expert on security at the workshop, the security of software (software design, quality of feature implementation, adequate testing and vulnerability testing) is a fundamental factor in the overall security of information systems. Several leading computer scientists at the meeting stressed that it was next to impossible to develop code for programs that sometimes have millions of lines without bugs and imperfections. Their message was that we will live with this for the foreseeable future and that, consequently, there was a need to better equip end-users to deal with security. In one participant’s opinion, “software monopolies” make the problem of insecure software worse by limiting product choice or creating “lock-in” effects.

Some participants felt that, as with other externalities, imposing shared liability on different market actors along the “security value chain” could usefully align incentives to increase security, while others pointed to the difficulty for regulators to shift liabilities and the potential to create significant distortions.


Privacy implications of the Internet and other distributed systems

The privacy implications for society of the Internet and other distributed systems were the subject of much discussion at the workshop.


Social networking sites

A question raised was how to preserve individuals’ privacy while allowing them to benefit from services and devices that tailor information to the individual or allow them to participate in social networks. Participants noted that when people interact with online services, they often lose anonymity, as service providers use data mining techniques, profiling, correlation and linking across online social networks to better identify users. Participants brought up the fact that, in addition, the information that users voluntarily place online can be more readily searched and categorised.


The role of profiling and advertising

Participants raised the question of the role of advertising in the erosion of privacy. While Internet users benefit from systems that derive revenues from advertising, some participants expressed concern that individuals may not realise the large amounts of information gathered about them for profiling and other commercial purposes, and the potential duration of retention of this information.


Geolocation technologies

Geolocation technologies 10 that use unique identifiers bound to devices (such as a mobile phone number) and/or associated with presence on a network (such as with instant messaging technologies) were also pointed out by participants as raising a number of privacy and/or freedom of expression issues. Several participants stated that individuals would increasingly struggle to control third parties’ access to their location and presence information.

Some participants pointed out that geolocation can provide tools for advertisers to target messages according to physical location. It was also suggested that some governments might use such tools to exert greater control over online behaviour within their borders. A participant gave the example of IP address location identification tools, which were becoming increasingly accurate. It was noted that IP address identification was increasingly used for commercial purposes, for research and statistics, as well as in legal-jurisdictional realms, with potentially large social impacts. Others noted that location assessment by IP addresses was still an inexact technology.
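The inexactness of IP address location identification noted above can be illustrated with a minimal sketch of prefix-based lookup, the technique underlying most such tools. The prefix-to-region table here is entirely hypothetical (it uses reserved documentation address blocks); real databases map registry allocations, and their accuracy is limited by the quality of registration data.

```python
import ipaddress

# Hypothetical, illustrative prefix-to-region table. Real geolocation
# databases map allocated address blocks to countries or cities.
PREFIX_TABLE = {
    "192.0.2.0/24": "Region A",
    "198.51.100.0/24": "Region B",
    "203.0.113.0/25": "Region C",
}

def locate(ip: str) -> str:
    """Return the region of the longest matching prefix, if any."""
    addr = ipaddress.ip_address(ip)
    best, region = -1, "unknown"
    for prefix, name in PREFIX_TABLE.items():
        net = ipaddress.ip_network(prefix)
        if addr in net and net.prefixlen > best:
            best, region = net.prefixlen, name
    return region

print(locate("203.0.113.7"))  # falls inside 203.0.113.0/25
print(locate("8.8.8.8"))      # no entry: location cannot be inferred
```

As the second lookup shows, an address outside the table yields no location at all, which is one reason participants described the technology as inexact.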


Wireless and sensor networks

It was noted that technological developments such as Radio Frequency Identification (RFID), which hold great potential for economic efficiency and social convenience, bring new privacy and security challenges associated with “ubiquitous communication”.

The question of whether more thought needs to be given to governance structures dealing with RFID devices acquiring sensing and networking capabilities was also raised at the workshop. More specifically, the question was raised in the context of any potential Object Naming Service 11 associating RFID-tagged objects identified with a unique Electronic Product Code (EPC) and then tying it to Internet-based information.


Applicability of OECD Privacy Guidelines

While the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (commonly known as the OECD Privacy Guidelines) were considered applicable to wireless and sensor networks, several participants felt that certain areas should be reviewed. The notion of consent, as envisioned in the OECD Privacy Guidelines, may not be robust enough in the new environment. The example was given of new situations where privacy problems do not relate to governments or businesses gathering data. Instead, these situations were viewed as peer-to-peer privacy invasion: e.g. online videos showing footage of members of the general public without their consent, or services enabling users to quickly post celebrity sightings using online mapping services to pinpoint their location.

Participants suggested that data gathering conducted by private individuals, or by sensor network environments with no central authority, is harder to address under the OECD Privacy Guidelines than situations involving a corporation, government or other “data collector”. In part, they suggested, this is because “accountability” is much reduced on the Internet.
At the same time, it was pointed out that it is often difficult to identify a responsible individual, and even when a responsible individual may be identified, trying to apply “old” standards of transactional responsibility might actually lead to the suppression of new valuable forms of social interaction. In this respect, the example of who should be held responsible in the case of citizen journalism web sites, where most of the content is user-generated, was noted.

An important and related question was raised regarding values, and whether younger generations participating in voluntary social networking sites place a different value on their privacy.

A debate ensued at the workshop as to whether privacy issues could still be partially addressed by, for example, including information in the metadata accompanying a picture, video, or other digital information, which identifies its origin and allows for contacting the author ex-post. In a similar way, it was pointed out that technologies may be able to help to transmit individuals’ privacy preferences ex-ante, e.g. through cellular phones, specifying browser preferences, and so forth.
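The metadata idea debated above might be sketched as a simple sidecar record accompanying a picture or video. This is purely illustrative: the field names (`author_contact`, `privacy_preference`) are hypothetical and not part of any existing standard such as EXIF.

```python
import json

# Illustrative only: a sidecar metadata record accompanying a photo or
# video. Field names are hypothetical, not drawn from any standard.
def make_provenance_record(author, contact, preference):
    return {
        "author": author,
        "author_contact": contact,        # enables contacting the author ex-post
        "privacy_preference": preference, # signals preferences ex-ante
    }

record = make_provenance_record(
    author="A. Example",
    contact="a.example@example.org",
    preference="no-redistribution",
)
sidecar = json.dumps(record, indent=2)
print(sidecar)
```

A record of this kind captures both halves of the debate: identifying the origin for ex-post contact, and transmitting the individual’s preferences ex-ante.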

The discussion stressed that the OECD Privacy Guidelines could be a model to build upon, as they have helped facilitate transborder data flows between countries by protecting privacy, being both technology-neutral and culturally neutral, and not specifying implementation mechanisms. In particular, their level of abstraction and neutrality could serve as a model for future instruments. Moreover, if there was general agreement about appropriate behaviour, it could potentially be codified in a similar way. The example of standardised labelling (e.g. software labels or service labels) was given as a potentially powerful tool.


Consumer empowerment and disclosure

User empowerment was frequently mentioned by participants as an important and positive social side effect of today’s Internet. They indicated that it should be retained in any new design.
It was suggested that certain types of software should be subject to a labelling regime, and that some software functionality such as that which logs keystrokes (e.g. when a user enters a password), should be disclosed to the user. Some felt that research, possibly by the NSF, could be conducted on technical tools to disclose software functionality in order to increase awareness for users. Some also raised the idea of new OECD guidelines to specify labelling regimes and disclosure.

Participants reiterated that one way forward could be better educating users to use separate “virtual machines” on their computers for transactions, according to their acceptable level of risk. One way would be to use a “safe” machine, professionally managed, for activities such as online banking or filing income taxes, and a less safe machine for less important or “fun” activities, with some kind of “reset button” function in case of infections. It was noted that software using this concept already exists.




The concept of participative networks envisages an Internet increasingly influenced by intelligent web services. These would be based on new technologies that enable users to contribute to developing, rating, collaborating and distributing Internet content and customising Internet applications, driving a range of new social and economic opportunities alongside new models of production.

Participants at the workshop agreed that, in looking to 2020, measures chosen now will have a significant effect on the participative Internet that potentially brings together features such as citizen journalism, artistic/cultural creation, or user ratings. They noted that the significance of participative networking was clear in that never before had an infrastructure allowed so many people to introduce so many kinds of content, on such a broad scale, and potentially with such wide-ranging impacts.

Participants also extended the concept of participative networking beyond individuals to include enterprises with collaborative Internet-based business models as well as grids that inter-connect personal computers and other devices such as cellular phones and networks, for example, sensor networks.

On the subject of participative networks, discussion covered i) openness to and participation of the general public and businesses; ii) participation of all, including rural regions and developing countries; and iii) openness and participation in relation to intellectual property rights enforcement and digital rights management systems.


Openness and participation

The Internet has demonstrated a clear capacity to foster competition and innovation at the edges of the network. Openness was viewed as enabling creativity, collaboration, innovation, and increasing economic competitiveness. Additionally, it was observed that the pace of such innovation is increasing and it may originate from any part of the world. The importance of interoperability was stressed as it enables the connection of large amounts of heterogeneous machines and networks, and the furthering of an environment of innovation and cross-fertilisation.

Participants agreed that the effects of transaction-free online environments run deep in societies and economies. On both the supply and demand sides, services built around social networking, which provide value at little or no (direct) cost, have proliferated. Facilitated by low barriers to participation, new models of commercial and non-commercial collaborative work have emerged. Illustrations given included the development of Wikipedia, the user-created encyclopaedia, and the development of open source software, both of which aim to harness the “collective intelligence” of Internet users in ways unimaginable just 30 years ago. Other examples of “Web 2.0” raised at the workshop included open Application Programming Interfaces (APIs) and mash-ups merging several services, such as online maps and location data. 12

It was pointed out that most of the protocols at the core of the network are based on open standards where protocol specifications are freely available to anyone, thus considerably reducing barriers to entry, and enabling interoperability. The concept of “openness” also encapsulates the notion that the “devices” connected to the network are amenable to new service applications, such as the personal computer or other general purpose programmable devices. The assumption is also made, it was suggested, that service applications are based on open specifications and that there are multiple sources of supply.

The importance of the end-points as generative platforms

The importance of having open innovative computer platforms at the core of the Internet ecosystem was raised. These open platforms were contrasted with “closed devices” or “tethered appliances”, such as video recording hardware or cellular phones. With respect to “closed devices”, the program code and communication components are not accessible to third-party innovation but, rather, are tied to a particular vendor or service provider. 13 Used in large numbers, “closed devices” could change the Internet into a much more structured and ossified ecosystem.


Grids, collaborative business models and “Enterprise 2.0”

Participants noted that beyond individuals, businesses were increasingly using Internet-based collaborative processes, which decreased friction in international business transactions. Some termed this economic and business trend “Enterprise 2.0”, and highlighted its significance in raising standards of living, wealth creation and competitiveness in global markets. 14

Participants also highlighted the increasing role of grid computing, i.e. federated distributed computing power. 15 Several participants underscored the evolution underway, from single-domain computers, to platform grids, at the edge of the Internet. Some took the view that these grids, of various sizes and shapes (e.g. home grids or office grids), would host applications, and that they could be foreseen to raise new challenges, such as security. 16


Participation of all, including the developing world

Participants linked inclusion themes with participative networks and said there was a need to think about accessibility for the future of the Internet. Policy makers at the workshop repeatedly referred to the importance of including the developing world in discussions on the future of the Internet and making its use more accessible and affordable. There was recognition of the increasing number of Internet users that are from outside the OECD area. An issue highlighted in particular was the integration into the Internet of cellular phones, which are set to have billions of users, including large numbers in the developing world.

Participants agreed on the continuing need for ICT-related capacity building in less-developed countries. While OECD expertise was viewed as potentially contributing significantly to this capacity-building exercise, the recurrent difficulty of mobilising support across the international community for ICT projects was recalled. Therefore, the need to engage institutions with funds for capacity building in ICTs, e.g. the IMF or the World Bank, was highlighted.


Openness and content production, distribution and access


Intellectual property rights

Participants recognised that the Internet makes it relatively easy to distribute and replicate digital content. A number of participants said there was a clear need to reward creativity and protect intellectual property rights, while achieving a balance with “fair use”. How best to ensure the protection of intellectual property rights while encouraging an open environment was debated at the workshop. Several participants noted that new business models were emerging around the provision of content, including use by traditional players.

Some participants stated their belief that greater emphasis in copyright law on exceptions, “fair use”, “user rights”, as well as the ability to access works of cultural heritage, could be important to balance incentives for content creation versus fostering content re-use, access and distribution.

Some participants pointed out that governments either control or play a significant role in the creation of large amounts of publicly funded content, research and information that could provide raw materials for many activities. They advocated facilitating access and commercial re-use of public sector content and information.


Digital Rights Management (DRM)

For some, technological protection measures (e.g. DRM) represent an important tool to ensure that rights holders are rewarded for the creation and distribution of content. An alternative view expressed by several workshop participants was that technological protection measures such as DRM were not always in the interest of consumers and that caution should be exercised regarding potentially embedding rules for today’s business models into technology, i.e. “locking in the present or the past”.

Values, creativity and culture


Common values

Participants stressed the need to fully comprehend the “global infrastructure” nature of the Internet, and the accompanying pressures placed on both users and governments. Participants noted that discussions on the future of the Internet were complicated by the different sets of values held worldwide, in that, in their view, all economic, social and technical issues are underpinned by values.

It was generally agreed that, due to the global nature of the Internet, there was an important need for articulating possible common principles that could stand the test of time. Some possible common principles put forward included: i) a shared desire to benefit from economic growth and economic development; ii) the critical importance of ICT infrastructure to further economic development; iii) widespread support of the efficiency of using markets; iv) the need to foster innovation and creativity; v) the general principle of a single interoperable Internet enabling all Internet users to potentially access any other Internet user; and vi) the need to foster security and stability to enable trust. Noting the OECD’s history of establishing principles and guidelines that have cross border applicability, several participants suggested that the OECD could develop work in this area in the context of its June 2008 Ministerial meeting on “The Future of the Internet Economy”.


Rapidly evolving values

The current concept of “ubiquitous networked society” was noted as a particular focus in East Asia – China, Korea, Chinese Taipei, and Japan. This concept envisages digital network utilisation anytime, anywhere, for anything and by anybody. Participants explained that the overall concept of “ubiquitous networked society” allows for national interpretation and variations. 17

It was highlighted that, in terms of the Internet, younger generations are growing up in a world that is greatly different from the communication environment experienced by their parents. One example cited was that many younger people spend significant time online using avatars or haptic interfaces in virtual reality spaces.

Several participants pointed to the fact that the development of IT may lead to the weakening of “collectivism” to the benefit of individual consumers. In this sense, participants likened the changes afoot to new patterns of organisation that speak to values of individuality and self-expression. For example, social media production offers potential improvements to the quality of our societies in terms of fundamental values such as freedom, democracy and equality. Participants also stressed their belief that society would be increasingly iconic, i.e. with visual images and moving features as dominant formats for communicating.

Several participants were of the view that IT applications involving the human body would have major cultural ramifications. Examples provided included digitalisation of the human body through “wearable information technology”, and interaction of humans with cyberspace in the area of robotics.





“For many economists, the Internet represents the Big Bang of Cosmology”
Dennis Weller

Several participants mentioned the role of the OECD in promoting international recognition of the importance of competition and innovation for economic growth. Discussion focused on the increasing pace of technological change and on the investment community’s point of view. On the one hand, it was noted, technological change and innovation create new opportunities for wealth creation and jobs. At the same time, the process can be destructive towards existing business models and firms that do not adjust to the new environment. This led participants to raise the question of who would fund long-term investment in communications infrastructure. A number of participants said that in this new environment governments should remain technology-neutral and avoid creating market distortions. Others felt there was a new potential role for public-private partnerships.

The need to better understand the process of economic change brought about by the Internet, in which disruption occurs for some players while opportunities arise for others, was stressed. Participants also pointed out the need to understand the influence of different factors, such as the role of open standards, open and documented interfaces and open source in facilitating competition and innovation.


Minimalist regulation approach versus critical infrastructure status

The issue of how to continue to apply a light regulatory approach, which uses the decentralised nature of the network and self regulation or best practice approaches, to an infrastructure that is increasingly critical to economic and social welfare was discussed at the workshop.

Participants highlighted that from a public policy perspective, the challenge was to structure an environment where market forces and competitive pressures lead to outcomes that are desirable, in terms of public policies and national and regional goals. They were also conscious of the important role that co-operation and co-ordination play in dealing with some of the Internet’s most pressing problems.

Globalisation and societal needs

Globalisation and democratisation of global supply and distribution processes were viewed as two key socio-economic trends that were facilitated by the Internet.

Participants agreed that a successful “Future Internet” offered the promise of improved communication for increasing productivity, better research, health care, education opportunities, and entertainment and social lives, as well as accelerating the growth of scientific knowledge in areas such as biotechnology and environmental management.

The major social and economic challenges that were highlighted by participants included demographic changes. In particular, ageing in the developed world, as life expectancy increases, will generate new requirements for health and security as well as simplified usability of information technology, privacy and security. Managing individuals’ personal online experience was viewed as being increasingly important, with consequences in the management of personal information and provision of access to ubiquitous health care or provision of telepresence applications. 18

It was also highlighted that networks will have to accommodate the increasing concern for energy efficiency and sustainability and that economic systems and societies (in particular, social norms, ethical values and laws) are in turn being drastically changed by ICT technologies.

Economic or social externalities

Participants pointed out that if economic incentives were better aligned with the externalities – through contracting, regulation or liability – technological solutions would follow. For example, making equipment and service providers more responsible for security could push them to develop cost-effective ways to provide a more trusted online environment. It was stated that currently network owners/end-users bear the whole burden of information insecurity. 19

However, it was noted that shifting liabilities may not constitute a solution, because the complexity and number of players involved make it difficult to align interests efficiently in respect to the Internet. Furthermore, it was expressed that introducing additional liabilities presents the risk of slowing down innovation.

Three areas of networking were specifically highlighted in relation to the role of the market and possible imperfections:

  • “Tragedy of the commons”, where a common space available for use by all, without immediate incremental penalty, leads to the destruction of the resource: a case in point is routing, where the cost of placing a load on the routing system is not incrementally borne by the originator.
  • Long-term social or community needs: these may not be adequately addressed by markets, when they are driven by short-term imperatives: an example is that of the upgrade to IPv6 which incurs costs and no revenues in the short term, even though it offers advantages over the long term including openness to innovation.
  • Risks/costs of disruption, which lead to higher capital costs: an example given was that of the telecommunications industry where, in the last decade, the telephone system has been transformed and replaced by a much cheaper and more efficient system, but where investment carries a higher risk than in the past in terms of stable and predictable returns.


Regulation and multi-stakeholder co-operation

The point was made that, as markets become more competitive, sector-specific regulation may be less relevant as opposed to general competition law and consumer protection regulation.

Discussion took place on the best form of regulation, particularly the need for hybrid options and multi-stakeholder co-operation. In that context, it was pointed out that today’s participatory nature of the Internet was greatly enabled by sector specific laws that were passed a decade ago in respect to openness (e.g. liberalisation and regulatory safeguards where there was insufficient competition).


Public private partnerships

The point was made that it is preferable for governments to treat some issues as contracting/procurement problems rather than regulatory problems, thereby increasing government efficiency and protecting markets from unpredictable regulation. Public-private partnerships in Canada and Sweden were highlighted to show that in some cases, where regulation runs counter to market interests, it can be more efficient for governments to operate within the marketplace than to focus on regulation to gain compliance.

A reference was made to Sweden. It was reported that having a compelling national defence interest in the stability of its Internet infrastructure, the Swedish government pays the excess costs of moving some critical infrastructure underground (i.e. in a fortified cave) and running fibre optic cables back into the rest of the Internet infrastructure in that country. However, forcing ISPs by means of a law to place some of their equipment in a more sheltered environment could generate non-compliance costs for the government, which would exceed the costs of a more direct economic intervention.

Another case mentioned was that of the Canadian government. The Task Force on Spam created in 2004 became a catalyst for combined private and public sector initiatives, which addressed aspects of the spam threat, such as the creation of voluntary guidelines for network management. 20


The role of open interfaces

Participants suggested that the Internet’s technological interfaces, designed in the 1970s, shaped the landscape of investment and competition. They said open interfaces determined industry structures, as open, standardised interfaces lower transaction costs and allow different parts of the market to be addressed separately. For instance, the technical decision to create an open interface separating TCP from IP enabled the development of IP applications that do not use TCP. In contrast, some believe the decision not to develop an open interface between routing and forwarding has meant that packet routing and forwarding functions have been combined in a single box, with a less competitive outcome. Others believe that the fusion of routing and forwarding is what allowed the Internet to be deployed as a separate overlay over diverse telecommunication facilities.
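The consequence of the open interface between TCP and IP can be illustrated by a datagram application that uses UDP directly over IP and never invokes TCP. A minimal loopback sketch:

```python
import socket

# Because IP exposes an open interface below TCP, an application can
# use another transport, here UDP, directly over IP. This minimal
# loopback exchange never touches TCP.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))          # let the OS pick a free port
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"hello over UDP/IP", ("127.0.0.1", port))

data, addr = server.recvfrom(1024)
print(data.decode())
client.close()
server.close()
```

Applications such as VoIP, DNS and streaming media rely on exactly this possibility of bypassing TCP while still running over IP.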

It was further argued that although technical design decisions made in the past determined the types of business models that developed, these decisions were not made with any particular business model in mind. Some participants felt that this approach, protecting the serendipity of the Internet, should continue to the extent possible. The past and current leadership role of the NSF in enabling all points of view to be represented was highlighted. Participants stressed the continuing significance of this role in the design of the future Internet.


Investment in infrastructure


Risks/cost of disruption

The point was made that infrastructure investments, as long-term investments, generate strong resistance to further change. 21 Several participants felt that the integration of billions of inexpensive devices on the next generation of the Internet could be disruptive to ISPs, for whom an undesirable scenario was to service a network according to the most demanding application while their charges are based on the provision of access.

Participants agreed that the increasing numbers of computing devices in businesses and homes, and in the environment, would lead to the connection of these many devices to applications over the Internet. It was also noted that the requirements for ubiquitous sensors, where the main concern is cost rather than performance, are different from those of computers. Participants felt that the Internet of the future would have to support wireless more effectively, in particular: mobility and ubiquitous access (with issues of rapid reconfiguration when roaming, for example), security, identity and variable performance.

Some participants felt that disruption was unavoidable because of the need to cut the cost of networking per packet by orders of magnitude, recognising that such a drop in communications costs could, as was the case 10 years ago, lead to a boom/bust cycle and the displacement of some players in the current market.


Servicing the last mile

The “last mile” was identified as a key issue at the workshop. While optical networking is ensuring that there are few capacity issues in the Internet for well-served backbone routes, substantial new investment will be required to extend capacity to individual consumer premises. It was pointed out that this would be necessary to accommodate advanced services, such as video over the Internet, which are expected to require an ever-increasing amount of bandwidth.

It was suggested that, whether the next generation networks are federated or not, higher speed optical and wireless networks will be needed, along with new business models and new arrangements which could coexist, such as:

  • Business models based on exclusive services: Some telecommunications carriers believe that they need vertical integration in order to attract investment in infrastructure, such as fibre-to-the-home, so they can benefit from the lower costs of IP-technology while retaining control over the provision of services. Such approaches call for the use of technologies such as IP Multimedia Subsystem (IMS) and for putting functions in the core of the network. In such a network, some services require an explicit partnership with the regional or national network operator. Critics pointed out that this approach would lead to a vastly more complicated network and that the levers of control would be less user-centric.
  • Separation between infrastructure providers and service providers: some participants pointed to recent research claims that it would be in shareholders’ best interest, in terms of net returns, that carriers separate structurally. Such action would decouple the two roles currently played by ISPs, i.e. managing network infrastructure and providing services to end-users. Infrastructure providers would manage the physical infrastructure, while service providers would deploy network protocols and offer end-to-end services. Examples were given where municipal networks and condominium networks have been explored.
  • Different infrastructure arrangements and business models could compete and co-exist.


Financial markets

It was felt that the issue of potential disruption of financial markets after a catastrophic event such as 9/11 should, by itself, be an area of focus (in particular to make sure that robust and reliable communication remains available immediately after such an event).


Convergence and Internet traffic exchange

A continuing theme throughout the workshop was how best to stimulate “creative destruction” and innovation in areas like telephony, broadcasting, and content, while at the same time creating an environment that supports investment in Internet infrastructure and services.

Two key features of today’s Internet industry structure were mentioned: i) heterogeneity, instability, and rapid change: stages of the value chain merge and separate, making regulation difficult, and ii) facilities-based Internet providers, while playing a vital role, may be poorly integrated into the value chain.

Three types of convergence were identified:

  • Horizontal convergence across existing platforms, such as cable, telecommunication or wireless.
  • Vertical convergence: changing the roles along the value chain e.g. for video delivery.
  • Convergence on the Internet, which is becoming the single platform.

Several factors were identified as slowing convergence: existing regulatory paradigms built around each platform, private parties’ existing property rights and the need to address legitimate social needs ahead of technological readiness (e.g. the necessity for reliable emergency calls).

It was argued that greater understanding of how networks exchange traffic and greater underlining of the importance of commercial negotiations for interconnection are required to assist policy makers in the transition to converged networks. In this new situation, inter-networking relationships are no longer confined to a relatively small group of homogenous telecommunication carriers but include a diverse set of carriage, service and content providers as well as the wider business community. 22

For traffic exchange among new networks, participants noted that the market approach offers considerable efficiencies and that the Internet’s peering framework, based on a market approach to interconnection, can be considered a good example of the cost and efficiency benefits of the transition from close regulation to greater reliance on markets.

Identification and addressing

Participants at the workshop were reminded that the Internet’s address architecture represents a collection of design decisions, or trade-offs, between various forms of conflicting requirements. For example, IP addresses identify the desired end-point for the packet’s delivery, which has implications in terms of the design of the associated routing system. 23

Participants raised the issue of potentially harmful outcomes associated with any early exhaustion of IPv4 addresses. It was suggested that the OECD examine the economic aspects of this issue.

Participants pointed out that, in the world of IPv4, an IP address is increasingly seen as a “locality token” with only a weak association with some form of identity. It was suggested that the major reasons for the slow deployment of IPv6 lie as much in economic considerations (the cost is not viewed as justifying the investment) and public policy considerations as in the underlying technology. To improve the efficiency of inter-networking and to meet the challenge of addressing new devices and objects, it was suggested that the concept of dividing an address into distinct realms, separating at a minimum end-point identity from network location, be investigated.
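The separation of end-point identity from network location suggested above can be illustrated with a minimal sketch. All names here (the identity key, the mapping service) are hypothetical, not part of any specific proposal discussed at the workshop; the point is simply that a stable identity can outlive a change of network attachment:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EndpointIdentity:
    """Stable identifier for a host, independent of where it attaches."""
    key_hash: str  # hypothetical: e.g. a hash of the endpoint's public key

@dataclass
class NetworkLocator:
    """Topological location, which may change as the endpoint moves."""
    ip_address: str

class MappingService:
    """Hypothetical service resolving a stable identity to its current locator."""
    def __init__(self):
        self._table = {}  # EndpointIdentity -> NetworkLocator

    def register(self, identity, locator):
        self._table[identity] = locator

    def resolve(self, identity):
        return self._table[identity]

# A long-lived identity survives a change of network location:
svc = MappingService()
host = EndpointIdentity("ab12cd34")
svc.register(host, NetworkLocator("203.0.113.7"))
svc.register(host, NetworkLocator("198.51.100.9"))  # host renumbered or moved
print(svc.resolve(host).ip_address)  # 198.51.100.9
```

In today’s IPv4 Internet, by contrast, the two roles are conflated in the single address, so a change of location also changes the apparent identity.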

It was suggested that Network Address Translators (NATs) could represent a potential solution, mediating identity and separating addresses from identity on the network. The point was also made that for many applications, such as traditional client-server or peer-to-peer applications, there is no incremental cost to operating through NATs, and that for VoIP applications the major issue is not NATs per se but the fact that NAT behaviour has not been standardised. Standardising NATs was therefore proposed as a possible way forward.
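The behaviour at issue can be sketched as a mapping table. The model below is a toy port-translating NAT (one common but, as noted above, unstandardised variant); the addresses and port range are illustrative assumptions, and the `inbound` method shows why unsolicited inbound flows, such as some VoIP call setups, break without a prior outbound mapping:

```python
import itertools

class SimpleNAPT:
    """Toy port-translating NAT: rewrites private (addr, port) pairs to a
    single public address with a per-flow public port. Illustrative only."""
    def __init__(self, public_addr):
        self.public_addr = public_addr
        self._next_port = itertools.count(40000)  # arbitrary ephemeral range
        self._out = {}   # (private_addr, private_port) -> public_port
        self._back = {}  # public_port -> (private_addr, private_port)

    def outbound(self, private_addr, private_port):
        """Translate an outgoing flow, creating a mapping on first use."""
        key = (private_addr, private_port)
        if key not in self._out:
            port = next(self._next_port)
            self._out[key] = port
            self._back[port] = key
        return self.public_addr, self._out[key]

    def inbound(self, public_port):
        """Reverse-translate; succeeds only if an outbound mapping exists,
        which is why unsolicited inbound connections fail through a NAT."""
        return self._back.get(public_port)

nat = SimpleNAPT("198.51.100.1")
addr, port = nat.outbound("192.168.0.10", 5060)
print(addr, port)         # 198.51.100.1 40000
print(nat.inbound(port))  # ('192.168.0.10', 5060)
print(nat.inbound(41234)) # None -> unsolicited inbound traffic is dropped
```

Because each vendor is free to choose its own mapping and timeout rules, applications cannot rely on any one behaviour, which is the motivation for the standardisation proposal above.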





It was suggested that many of the political, economic and regulatory issues associated with the Internet reflect the catalytic role the Internet has played in increasing the global and interdependent nature of the world economy. Many ongoing public policy issues need to be reviewed in the context of Internet developments, but there was a consensus that they should not necessarily be viewed as “Internet” issues. The importance of clearly differentiating policy issues that are solely about the Internet from issues that are merely affected by the Internet was emphasised, in view of the risks of politicisation. Some participants felt the danger of politicisation carried accompanying risks of a loss of interoperability at the global level.

There was general acceptance of the multi-stakeholder model as a legacy of the World Summit on the Information Society (WSIS), and widespread recognition that governments are no longer the only problem-solvers but need to co-operate with other stakeholders.


Role of governments

Participants stressed the need to continue encouraging common definitions for, and mutual respect of, the roles of private and public sectors.

Participants highlighted an important role for governments in articulating and defining societal goals, as well as in setting clear and consistent ground rules in the legal and taxation areas.

It was also noted that governments should develop common international laws with regard to fraud, abuse and e-commerce, and should support the private sector’s efforts in these areas.

It was re-emphasised that, to the extent possible, governments should strive to isolate “geo-political” aspects from the numerous social and economic factors shaping the future of the Internet. 24 Several participants viewed this as important to enable the Internet to continue to enjoy the relative freedom it has had to date, which many believe has fostered its development.

Several participants also pointed out that, in the current environment, countries are highly inter-dependent and that the development of the future Internet could only be successful as an international collaborative effort. Participants also emphasised that international co-ordination is necessary to protect the integrity of the Internet while facilitating e-commerce and global trade.25


Ground rules to foster an enabling environment

At the workshop several recommendations were put forward about ways for governments to foster an enabling environment:

  • Access/connectivity policies should promote access on fair terms and at competitive prices to all communities, irrespective of location, in order to realise the full benefits of broadband services. Where market-driven availability and diffusion of broadband services is insufficient, government initiatives should foster such availability.26
  • Regarding content, governments should improve access to public sector information, for example by introducing open-access requirements for publicly funded research 27 and by making public sector content and information available.
  • In the area of copyright and intellectual property rights, regulatory frameworks should balance the interests of suppliers and users, without disadvantaging innovative e-business models. 28


Role of technologists

It was pointed out that today’s Internet co-evolved with the computer, not with cellular telephony, which developed independently; new network technology, by contrast, should support any upcoming computing and communications technology.

There was acknowledgement that it is impossible to anticipate all new technologies, or all of their political, economic and social side-effects, and that there are many different views of what the dominant computing technology will be 10 years hence (sensors, cell phones, embedded processors, USD 100 laptops, etc.), or which services will predominate. There was clear recognition that cellular phones are experiencing very high adoption rates in many regions of the world.


Clean slate approaches versus incrementalism

There was no consensus on the issue of “clean-slate” architecture versus incrementalism: certain participants believed that the future Internet, or its replacement, can be re-designed in a “clean-slate” manner, i.e. potentially totally differently from today’s Internet, whereas others doubted that possibility because of the large existing community of people, networks and applications invested in the current Internet. The latter foresee an evolution based either on the current infrastructure or on a GENI-like one, in which imperfections are addressed along the way.

One issue highlighted in particular was the integration into the Internet of third-generation cellular phones, which could potentially have billions of users.


Co-operation of technologists and policy makers


Necessary dialogue between technologists and policy makers

It was noted that some technological choices make it easier to implement the access policies that are locally and globally defined, and foster appropriate accountability within the Internet. The need for a partnership between policy makers and technologists in this area as well as in other areas, such as security and privacy, was emphasised. 29


OECD Ministerial meeting in June 2008

Participants were reminded that the June 2008 Ministerial meeting on “The Future of the Internet Economy” will address many of the issues discussed at the workshop.

Given the large part of the global community that has not yet gained access to the benefits of the Internet, participants highlighted the need, in the Ministerial context, to give consideration to expanding Internet access and to engaging key players. 30

The suggested features for the meeting’s outcomes were that they should: i) express complex issues clearly for policy makers while highlighting best-practice approaches and requirements for future work; ii) provide direction to the OECD about where it can best add value; and iii) be applicable in the larger global context.

Participants stressed their view that it was incumbent upon governments to articulate a high-level vision to be brought to the Ministerial meeting, informed by the multi-stakeholder context.

Participants further reaffirmed the importance of OECD inputs in economic analysis and measurement for evidence-based policy making.




The OECD appreciates the support and participation of the NSF, which sponsored the workshop on which this paper is based.

All participants are to be thanked for their expert, thoughtful, and stimulating contributions that provided a rich source of discussion at the workshop.

For the NSF, the objective of the workshop and this paper is to provide information for technical research planning for its Global Environment for Networking Innovations (GENI) facility. The GENI project aims to enable the research community to invent and demonstrate a global communications network and related services, which will be qualitatively better than today's Internet.

For the OECD’s ICCP Committee, the objective of this paper is to help inform its programmes of work on Internet-related policy issues, in view of an OECD Ministerial meeting on the “Future of the Internet Economy”, which is to take place in Seoul, Korea, 17-18 June 2008.




i) The openness of the Internet has been a clear factor in fostering competition and innovation, and is increasingly enabling users to develop, collaborate and distribute content and customise applications, driving a range of new social and economic opportunities:
- What do different people mean by “openness” of the Internet? Is openness the Internet’s key success factor?
- Is it possible to make sure the conditions are in place for innovation to continue to take place at the edges of the network and how can this be enhanced? Are there practices that hinder innovation at the edges?
- How can experimentation with new models for the economic use and creation of new digital content be encouraged?

ii) Security threats endanger network operation and a trusted online environment at a time when an increasing number and variety of communities and businesses critically rely on the Internet:
- Can a coherent layer for security be deployed in the Internet’s architecture to help address trust issues? How can incentives be aligned with the roles of different stakeholders to increase Internet security e.g. with technical standards, regulation and peer pressure?
- How can the privacy concerns of individuals and industry be balanced with the information requirements of the public sector and academia?
- How can we reconcile the need to share personal information with the need to safeguard individual rights, in particular the right to privacy and the protection of personal data?

iii) Convergence between previously distinct networks and services toward the use of TCP/IP generates new demands on the Internet and places strains on existing regulatory models:
- While business models are in flux and as previously distinct industries such as broadcasting and traditional telecommunications converge on the Internet, are there criteria that can help guide policy makers and researchers?
- How can we ensure there is sufficient investment to meet the network capacity demands of new applications and of an expanding base of users?
- Could the current commercial solutions used for Internet traffic exchange be used as a model for traffic exchange between convergent networks? How can naming and addressing be improved so as to improve efficiency of inter-networking?
- How can growth of the Internet be measured?

This paper summarises discussions at the workshop and an exercise in which participants were invited to rank 30 issues according to their urgency, complexity and importance (see Annex 2). Additionally, this paper draws from the position papers provided by participants (available at ).

For the ranking exercise, a list was developed prior to the meeting of several leading social, economic, regulatory and ethical considerations relating to the future of the Internet, which could be considered from public policy and/or technology perspectives. The purpose of this list was to provide a high-level view on the major trends affecting the “connected” world, and to begin to build a common understanding of the interaction of trends, potential options and priorities (Annex 2).

Each participant was asked to consider a small set of specific questions from the list in more detail (Annex 1). This provided an opportunity to reflect on today’s Internet, its achievements and any unintended consequences or outcomes of its existing design. Examples of questions addressed during this process were: Why are trust and identity such difficult concepts in relation to use of the Internet? Why has multilingualism been so hard in respect to Internet addresses? Was network manageability an afterthought? Why do some consider Quality of Service across different networks as a commercial failure? Could a different Internet model be conceived that would have led to better outcomes? 2




1 The position papers submitted by participants at the workshop are available at: (back to text)

2 (back to text)

3 An autonomous system (AS) is a collection of IP networks and routers under the control of one entity (or sometimes more) that presents a common routing policy to the Internet. (back to text)

4 Marjory Blumenthal, presentation at OECD, June 2006, “Wither the Internet?”. (back to text)

5 Simson Garfinkel, position paper, “The Pure Software Act: A Proposal for Mandatory Software Labeling”, (back to text)

6 Common maxim among commercial Internet operators, (back to text)

7 Netflow is a Cisco IOS software feature and also the name of an open (but proprietary) Cisco protocol for collecting IP traffic information. (back to text)

8 Bill Woodcock, position paper, “Instrumenting the Internet to Support the Development of Informed Policy”, (back to text)

9 See in particular, Shane Greenstein, “Data Constraints & the Internet Economy: Impressions & Imprecision”, (back to text)

10 Geolocation is the real-world geographic location of an Internet-connected computer, mobile device, or website visitor, based on an IP address, MAC address, hardware embedded article/production number, embedded software number, or other, perhaps self-disclosed, information. (back to text)

11 To locate authoritative metadata and services associated with a given Electronic Product Code (EPC). (back to text)

12 See in particular Brian Kahin, “How is the Internet Affecting the Relationship Between Social and Economic Activity?”, position paper, (back to text)

13 Jonathan Zittrain, position paper, “The Future of the Internet – And How to Stop It", (back to text)

14 Bill Saint Arnaud, position paper, “Most Significant Economic Challenge to the Future of the Internet”, (back to text)

15 Grids could also be defined as co-ordinating resources that are not subject to centralised control using standard, open, general-purpose protocols and interfaces, to deliver nontrivial qualities of service. “What is the Grid? A Three Point Checklist”, Ian Foster, 2002, (back to text)

16 See in particular, Lee Mcknight, position paper, “The Future of the Internet is not the Internet: Open Communications Policy and the Future Wireless Grid(s)”, (back to text)

17 See in particular, Inuk Chung, position paper, “Roles and Impacts of IT on new Social Norms, Ethical Values and Legal Frameworks in Shaping a Future Digital Society”, (back to text)

18 Telepresence refers to a set of technologies which allow a person to feel as if they were present, to give the appearance that they were present, or to have an effect, at a location other than their true location.  (back to text)

19 Bruce Schneier, position paper, “Information Security and Externalities”, (back to text)

20 Richard Simpson, position paper, “The Role of Government in the Global Network Economy”, (back to text)

21 Geoff Huston, position paper, “Addressing as a Fundamental Part of the Internet”, (back to text)

22 OECD, 2006, Internet Traffic Exchange: Market Developments and Measurement of Growth (back to text)

23 See in particular, João Da Silva, position paper, “Converged Networks and Services”, and Geoff Huston, position paper, (back to text)

24 See in particular, Paul Twomey, position paper, “Effect of Multilingualism on the Internet, International Issues that Affect How Governments and Economies Address Issues Relating to a Global Infrastructure”, (back to text)

25 See in particular, Makoto Yokozawa, position paper, “How Different Governments and Economies Address Issues Relating to a Global Infrastructure”, and Michelle O’Neill, “The Shrinking Globe – How the Internet has Brought us Together ”, position paper, (back to text)

26 OECD, 2004, Recommendation of the OECD Council on Broadband Development, (back to text)

27 OECD, 2006, Public Sector Information And Content, and OECD, 2004, Ministerial Declaration on Access to research data, (back to text)

28 OECD, 2004, op. cit. (back to text)

29 See in particular, Leslie Daigle, position paper, “How Technology Evolution Can Be Partnered with Social, Policy and Regulatory Discussions”, (back to text)

30 See in particular, Markus Kummer, position paper, “Internet Governance and the Need for an Inclusive Multi-Stakeholder Dialogue”, . (back to text)








