If we were to start from scratch today to design a quality-controlled archive and distribution system for scientific and technical information, it could take a very different form from what has evolved in the past decade from pre-existing print infrastructure. Ultimately, we might expect some form of global knowledge network for research communications. Over the next decade, there are many technical and non-technical issues to address along the way, ranging from identifying optimal formats and protocols for rendering, indexing, linking, querying, accessing, mining, and transmitting the information, to identifying the sociological, legal, financial, and political obstacles to realizing ideal systems. What near-term advances can we expect in automated classification systems, authoring tools, and next-generation document formats to facilitate efficient data mining and long-term archival stability? How will the information be authenticated and quality controlled? What differences should be expected in the realization of these systems for different scientific research fields? Can recent technological advances provide not only more efficient means of accessing and navigating the information, but also more cost-effective means of authentication and quality control? Relevant experiences from the open electronic distribution of research materials in physics and related disciplines during the past decade are used to illuminate these questions, and some of their implications for proposals to improve the implementation of peer review are then discussed.