Friday, September 26, 2008

BPN 1232 Watch where you are going: EU cities differ sharply in internet performance

It is interesting to read research findings such as: London is the best city for internet performance, while in Milan, Madrid and Dublin you may struggle with gaming (come to Amsterdam!). It points to a kind of digital divide.

The findings are part of the report European City Internet Performance Index by the British research bureau Epitiro, which calls itself The Broadband Communications Authority. The bureau has monitored top internet service providers (ISPs) since 2003 in order to provide industry bodies with actual customer-experience data on broadband service. It has since expanded its coverage by monitoring city, suburban and rural broadband performance, in both wired and wireless (3G) form, across all Member States of the European Union. The question is of course why urban areas were selected for testing. In short, this boils down to the point that urban areas are the fountainheads of countries, where most activity is concentrated. Besides, the performance of broadband benefits the social and economic structure of a country.

The report presents the results of broadband performance measurements in the major European cities Amsterdam, Dublin, Lisbon, London, Paris, Madrid, Milan and Zurich. It is a preliminary report, which will be replaced by the beginning of 2009 with one covering more cities. Most likely cities like Stockholm, Oslo, Helsinki, Copenhagen and Berlin, and hopefully Munich and Vienna, will be included. It is a pity that the results for Stockholm are not included in this preliminary report. In the broadband world Stockholm started early with broadband and would provide a good measuring stick. But we will have to wait till January 2009.

The likely performance levels of popular uses such as web surfing, VoIP, internet gaming and streaming video were the drivers behind the technical aspects measured. The researchers had their own technical method for this, using a network of satellite devices that simulate typical residential computers by connecting to the internet and executing a series of test routines. Every 30 minutes the satellite devices connect to broadband providers and measure HTTP download, DNS resolution time, ping time and packet loss whilst connecting to popular local and international web sites.
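A minimal sketch of what one such test routine could look like, in Python; this is my own illustration, not Epitiro's actual implementation, and the hostnames a caller would pass in are hypothetical:

```python
import socket
import time
import urllib.request

def dns_resolution_ms(hostname):
    """Time a DNS lookup for the given hostname, in milliseconds."""
    start = time.perf_counter()
    socket.gethostbyname(hostname)
    return (time.perf_counter() - start) * 1000

def ping_ms(hostname, port=80):
    """Approximate ping time as the time to open a TCP connection."""
    start = time.perf_counter()
    with socket.create_connection((hostname, port), timeout=5):
        return (time.perf_counter() - start) * 1000

def http_download_kbps(url):
    """Download a resource and return throughput in KB per second."""
    start = time.perf_counter()
    body = urllib.request.urlopen(url, timeout=10).read()
    elapsed = time.perf_counter() - start
    return (len(body) / 1024) / elapsed

def run_cycle(sites):
    """One measurement cycle over a list of web site hostnames."""
    return {site: {"dns_ms": dns_resolution_ms(site),
                   "ping_ms": ping_ms(site)}
            for site in sites}
```

A scheduler would invoke such a cycle every 30 minutes and log the results for aggregation; measuring packet loss would normally require raw ICMP, which is omitted here.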

The dataset was based on over 2 million tests from July 2008 to September 2008, and the testing process remains active. This was during the holiday season, when internet use in the European cities is lower. It may have given the researchers a safe start, but I would like to see the results after the holiday period.

Still the key findings are remarkable:
• London has the best average internet service amongst the cities tested and also the fastest individual ISP;
• Amsterdam and Zurich also offer above average internet service;
• Dublin and Milan broadband service levels were the lowest of the European cities tested;
• Multinational ISPs that trade under the same brand name in different countries vary in performance as much as 44%.

The report concludes that there is a significant ‘digital divide’ amongst European cities in terms of broadband performance. Whilst all cities and ISPs can handle basic web browsing and email, the demands of VoIP and streaming media may not be reliably met by ISPs in some cities tested.

Blog Posting Number: 1232


Thursday, September 25, 2008

BPN 1231 A tricky proposal by MEPs

The European Parliament has voted in favour of new legislation offering ISPs the opportunity to cut off internet access as punishment for illegal downloading. Under the new rules providers can be obliged to terminate the connection of users who have illegally downloaded music, games or video three times. The measure, however, is not mandatory.

The new package of telecom measures gives EU member states the possibility to let ISPs punish their users. The measures were accepted into the telecom package at the instigation of French members of the European Parliament, who are of the opinion that illegal downloading harms the creative industry. (A similar measure was proposed by the French minister of Culture a few months ago.) The usual argument of musicians missing out on revenues was brought up again. The EU countries still have to agree to the telecom package, which will then go back to the EU Parliament for a second reading.

Disconnecting a user implies that the ISP starts checking the content being downloaded, manually or by filter. Theoretically (and in practice it has already been signalled) it is possible for a provider to check whether a user is downloading large copyrighted files, causing the network to slow down. However, some members of the European Parliament (MEPs) find it unacceptable that ISPs are promoted to police functions by making them responsible for the content of the internet. Copyright should not take precedence over privacy.

The package of measures is supposed to improve the rights of the digital consumer. One of the measures limits the duration of subscriptions to a maximum of two years. This should protect subscribers from long and expensive contracts. Another measure obliges the ISP to offer parents software to shield their children from adult sites and chatrooms. Besides, internet users will be better informed about their rights and warned when there is interference with their internet use that causes privacy problems. Users would also be able to claim damages from the ISP for non-performance.

There were also non-controversial measures, like the statement that the internet should be there for everyone, that tariffs and bills should be more transparent, and that there should be better protection of data traffic, not only on public sites but also on social sites.

Blog Posting Number: 1231


Wednesday, September 24, 2008

BPN 1230 EC consultation on next generation broadband

Now that fibre broadband projects with speeds of 100 Mbps are being introduced in several European countries, the European Commission is launching a consultation on the regulatory principles to be applied by EU Member States to Next Generation Access (NGA) broadband networks. The EC is doing this at a time when the first commercialisation of next generation broadband projects is taking place. In the Netherlands, for example, many commercial fibre-optic projects are in progress or even operational. The Netherlands is in the forefront after studies for large municipalities delivered the message that high-speed access to the internet was needed for their knowledge industry and economy, and that, if needed, the municipalities should take the lead. This caused some problems between non-profit bodies and commercial companies in setting up projects. With a ruling by the EC on the Amsterdam broadband project, basic rules were established: non-profit bodies like municipalities are allowed to stimulate broadband projects, but are not allowed to invest commercially.

NGA optical fibre-based networks enable bitrates several times higher than those currently available on traditional copper wire networks. NGAs are required to deliver high-definition content (such as high definition television) and interactive applications. The objective of a common regulatory framework for NGA is to foster a consistent treatment of operators in the EU and thereby ensure the necessary regulatory predictability to invest. The Commission is consulting on the basis of a draft Recommendation, addressed to the regulators in the 27 EU Member States and suggesting definitions for harmonized categories of regulated services, access conditions, rates of return and appropriate risk premiums. The public consultation will be open until 14th November 2008. The Commission will then finalise the Recommendation in the light of comments received and formally adopt it in 2009.

The deployment of NGA is indispensable to deliver new broadband services to European consumers. While a number of operators, both incumbents and alternative operators, have launched large-scale rollouts of new broadband infrastructure in a number of Member States, Europe appears to be still lagging behind other economies, notably the United States and Japan.

The basic principle of the Commission's draft Recommendation is that national regulatory authorities should provide access to the networks of dominant operators at the lowest possible level. In particular, they should mandate access to the ducts of the dominant operators, allowing competitors to roll out their own fibre. Regulators should also impose further physical access obligations (access to unlit fibre), beyond access to ducts, where ducts are not available or the population density is too low for a sustainable business model. Access to active elements such as "bitstream" shall be maintained provided lower-level remedies do not sufficiently address distortions of competition.

The draft Recommendation also provides a common approach to ensure non-discriminatory access, as well as a methodology for calculating a proper rate of return, including a risk premium. The Commission believes that for NGA, rates of return should be derived in the light of the risks associated with this kind of investment, bearing in mind that the nominal pre-tax weighted average cost of capital for fixed and mobile operators has been roughly 8 to 12% in recent years.
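To make the arithmetic concrete, such a calculation might look roughly like the sketch below; the capital structure, costs and premium are invented for illustration and are not taken from the draft Recommendation:

```python
def wacc(equity_share, cost_of_equity, cost_of_debt):
    """Nominal pre-tax weighted average cost of capital."""
    return equity_share * cost_of_equity + (1 - equity_share) * cost_of_debt

# Hypothetical operator: 60% equity at 12%, 40% debt at 6%.
base_wacc = wacc(0.6, 0.12, 0.06)   # 0.096, i.e. within the 8-12% range
nga_risk_premium = 0.02             # extra return for the riskier fibre roll-out
allowed_return = base_wacc + nga_risk_premium
print(f"Allowed rate of return: {allowed_return:.1%}")  # prints 11.6%
```

The point of the risk premium is visible in the last line: the allowed return sits above the ordinary cost of capital, compensating the operator for the uncertainty of the fibre investment.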

In the meantime the Finnish government has approved a proposal to provide all homes and businesses with internet access at 100 Mbps by 2015. The decision is based on proposals from a broadband study conducted by Harri Pursiainen, Permanent Secretary of the Ministry of Transport and Communications. The project aims to have a distance of no more than 2 km from each user to the nearest fibre-optic or cable network.

Blog Posting Number: 1230


Tuesday, September 23, 2008

BPN 1229 iRex beyond e-books and e-Papers to the business segment

It was clear that Forbes had received all the information on the launch of the new iRex Digital Reader 1000 (see photograph). All the details were correct. The launch held no surprises for the journalists present; I was not there. The first reviews basically copied the press release; only a few journalists had been able to hold and touch the new reader, and they were in a jubilant mood. ZDNet opened by saying that the new reader gave a Windows feeling; supposedly this is meant as a compliment. The digital lifestyle site and magazine Bright.nl sported the caption: Size does matter (we know that by now from all the spam mail). The manufacturer iRex Technologies went a step further in its press release by using the headline: iRex Opens New Chapter In E-Reading. In my opinion this is too much, as only colour or video would open up a new chapter. The introduction of the DR1000 is just a logical expansion of the iLiad.

So the iRex Digital Reader 1000 is there, or rather two versions are for sale now. The specs:
Diagonal screen size: 10,2 inch (25,9 centimetres)
Screen resolution: 1.024 x 1.280 pixels
Screen size: text in A4 format can be shown almost 1-on-1, as the white margins have been cut off; the developers compromised between the A4 paper size in Europe and the legal paper size in the USA
Dimensions: 27 cm x 21,7 cm x 1,2 cm (!)
Weight: 570 gram
Operating system: open source Linux
Document formats: most common formats such as PDF, HTML, TXT, JPEG and PowerPoint
Version DR1000: 499 euro (650 US dollar)
Version DR1000S with digital pen and write functionality: 599 euro (750 US dollar)

The specs look good and interesting. The open source Linux will attract diverse applications. The size is interesting in a business environment and is not dependent on book or newspaper formats. The DR1000 is clearly intended for the business market, as it can show documents in the most common formats. Missing from this new reader is the bar for flipping pages back and forth, which is available on the iLiad. I thought that this was a new reading tool, but apparently not for the business environment.

But the disappointment of the introduction was the absence of the DR1000SW with WiFi, Bluetooth and 3G functionality, which will be available later this year. No price for this version was mentioned in the press release; however, Forbes indicated that it would cost 850 US dollar, which would translate to 699 euro. To me this would be the most interesting one of the set. So why not introduce this one at the same time? Or are the DR1000 and DR1000S just teasers to warm up the market? I think there is more behind this introduction. iRex Technologies is getting nervous, as Plastic Logic has been much in the news sporting its reader with a 10,2 inch diagonal screen, due to be released in the second quarter of 2009. So, yes, the DR1000 and DR1000S are teasers to warm up the market. And iRex Technologies will have to launch the DR1000SW before the second quarter of 2009 to keep the initiative. But by doing so it will also have to set the price of the device, offering Plastic Logic the opportunity to undercut the DR1000SW price of supposedly 850 US dollar.

I will be saving in the meantime in order to buy either the DR1000SW or the Plastic Logic reader.

Blog Posting Number: 1229


Monday, September 22, 2008

BPN 1228 While waiting for the news from iRex

Tomorrow iRex Technologies, the manufacturer of the e-reader iLiad, is said to unveil its first 10,2-inch e-book reader for the business market. The iRex site says that iRex Technologies will unveil something new. So I will be checking the site continuously. But the magazine Forbes was already able to unveil some details, even with a picture of the e-reader (see photograph). The basic message, according to Forbes, is that iRex Technologies will release a large-size reader with a screen measuring 10,2 inch diagonally. It looks like iRex Technologies is going to shift gears in battery life, software and target groups.

The move from the small Sony PRS-500 to the iLiad was already a step forward. But the move from the iLiad to the 10,2 inch screen of the iRex Reader 1000 is a giant step. Especially for the business sector this will be interesting, as most documents are still A4 or legal size. Whether the Reader 1000 will be interesting for newspapers will depend on the price; the first rumblings are that iRex Technologies will again be at the top of the price list.

It looks like iRex Technologies is the only one in that field, if the announcement is correct and concerns the Reader 1000. But just a week ago Plastic Logic was also in the news, showing a large e-reader of 8.5 x 11 inch, measuring 10,2 inch diagonally. Plastic Logic will start selling the product commercially at the beginning of next year. If all the rumours around the iRex Reader 1000 are true, iRex Technologies will start selling immediately in the USA.

So, we are witnessing a new step ahead in screen size. It will also be an incremental step in battery life and software. But I will wait to see the specs. I can hear the critics already saying that it is not a step ahead, as the screen still uses 16 grey tones and no colour, while video is also still missing. I am not impressed by that type of comment, as colour will come around by 2009, as indicated by Russ Wilcox, the chief executive of E-Ink. Video is said to be on the horizon by 2012. So far digital paper has proven to be unruly for web processing and certainly for video; a few years might be needed to get around the problems involved. Yet a large screen will already be a step ahead; the next logical step would not be web browser functionality and video, but flexible digital paper.

Blog Posting Number: 1228


Sunday, September 21, 2008

Update iRex to present 10,2 inch e-Book reader

iRex Technologies, the manufacturer of the e-reader iLiad, is to unveil its first 10,2-inch e-book reader for the business market tomorrow. The iRex site says that iRex Technologies will unveil something new, but the magazine Forbes already had more details.

The name of the new e-reader will be iRex Reader 1000. It will be able to use any file format, including PDF, Word and HTML-rendered documents, contrary to the limited file support of the iLiad. However, there will be no video and no colour yet.

There will be three versions of the iRex Reader 1000. For the base version you will have to shell out 650 US dollar. For the version with writable screen you will pay 750 US dollar. For the version with WiFi, Bluetooth and 3G the price is 850 US dollar.

Watch this blog for a new posting on Tuesday morning.

BPN 1227 European collecting societies quarrel among each other

The Dutch collecting society BUMA/Stemra has been denied the right to offer pan-European music licenses in no fewer than two court cases. Recently Ms Neelie Kroes, Commissioner of the European Commission, announced that the collecting societies had to demolish their monopolistic structure of regional license areas. The first attempt by BUMA/Stemra has gone badly wrong.

On 25 August 2008, Mannheim Regional Court granted an interim injunction against the download provider Beatport.com as well as BUMA/Stemra. The injunction prohibits Beatport.com from making specific musical works from GEMA's repertoire available to the public over the Internet in the territory of the Federal Republic of Germany without having previously obtained the consent of GEMA. The German collecting society administers the copyrights of more than 60,000 members (composers, lyricists and music publishers) as well as those of over 1 million rights owners round the world. BUMA/Stemra is prohibited from licensing such use. Beatport.com has already recognised the interim injunction as the final ruling for itself.

The Dutch collecting society BUMA/Stemra had announced on 21 July 2008 that it had granted Beatport.com a Pan-European licence allowing Beatport.com to offer the entire worldwide repertoire of music - i.e. also including GEMA's repertoire - online throughout the EU. From GEMA's perspective, BUMA/Stemra is not entitled to do this, as it was granted the right to licence GEMA's repertoire only for uses within its own administrative territory. This standpoint has now been confirmed by Mannheim Regional Court. Like GEMA, the British collecting society PRS has also successfully gone to court against the EU-wide licensing of its repertoire by BUMA/Stemra to Beatport.com.

GEMA, for its part, is one of the leading collecting societies in the complex market of Pan-European online licensing. It has, for instance, recently set up a one-stop shop for the Europe-wide licensing of mobile and online use of the Anglo-American repertoire of SONY/ATV Music Publishing. CELAS, the company established by GEMA and the British collecting society, has already been licensing the Anglo-American repertoire of EMI Music Publishing on a Pan-European basis in the online and mobile sector since December 2007. With these models, GEMA can offer Pan-European licences to licensees for the use of extensive repertoires.

Three days earlier the Dutch judge agreed with the British collecting society PRS that BUMA/Stemra was not allowed to give Beatport.com a pan-European license for satellite, cable or internet. BUMA/Stemra pleaded that the regional limitations between the collecting societies, part of the Contract of Reciprocal Representation (CRR) from 1973, were not applicable to the licensing of online music, as cross-border music rights were not in use in 1973.

Dirk Visser, a Dutch lawyer and professor specialised in copyright, called the BUMA/Stemra pan-European deal with Beatport.com a provocation, saying that BUMA/Stemra has walked into a minefield.

Blog Posting Number: 1227


Saturday, September 20, 2008

BPN 1226 Making online commerce a reality

Ms Neelie Kroes (see photograph), European Commissioner for Competition Policy, made the following closing remarks at the Online Commerce Roundtable in Brussels on 17 September 2008:

“Ten years ago The Economist proclaimed "The Death of Distance". It was right. Liberalisation of telecommunications services has meant that the internet is now within reach of the 500 million people in the 27 countries of the European Union. So consumers now have the internet, with global means of payment, and global distribution systems.

But consumers do not yet have global commerce. Far too often they do not even have pan-European commerce. Consumers are not happy with this. We have heard from one highly respected consumer organisation today, and I have heard similar complaints from other organisations and from individual consumers in the past. So have my fellow Commissioners.

Consumers see the internet, and the borders that exist online, and feel that they are not getting a fair deal. The internet gives more power to the individual than any technological change in history. We cannot let that power be taken away.

This goes beyond narrow commercial interests. The people of Europe were promised a union, a place without borders: but on the internet they have not yet got it. Progress has been made; sometimes impressive, but it is not enough.

There seem to be many reasons for this, some common to the online and the offline worlds, including tax systems, consumer protection laws, guarantees and after-sales service. My colleagues in the Commission are doing their best to address these, knowing that there is a lot more to be done.

But even in areas where these concerns have been overcome, consumers often find that the products they are looking for are not available to them. As Competition Commissioner, I want to know why.

- If this is because the competition rules are not clear enough, I will clarify them.
- If it is because the competition rules are not up to date, I will update them.
- And of course, if this is because the competition rules are not being respected, consumers and companies should know that I will enforce them.
- If the problems do not lie with the competition rules, but are due to the wider regulatory environment I will support my colleagues in the Commission to make any changes that are needed.

The purpose of this meeting is to begin a discussion with consumers and with companies on online issues. I want to hear more views, of course, from other people not present here today, which is why I am inviting others to send me their views. I will also publish a short report of today's meeting on which people can comment. Today is the start of a discussion, not the end of one.

And to help that discussion prove fruitful, I want to outline some impressions that I take away from this meeting, and outline what I intend to do next.

The Single Market of the European Union is based on a relatively straightforward premise. The fewer the barriers between markets, the more efficiently those markets will work. That is why we have spent fifty years trying to create a Single Market for goods and services, for companies and workers, and for capital. And that is why, once state and regulatory barriers have been brought down, the competition rules are there to ensure that state barriers are not replaced by commercial ones.

There are well established competition rules for companies that enter into distribution agreements. They are due for review next year, and we are working hard on that review.

These rules already have provisions for internet sales, and if I hear that these rules are not being respected, then I will look into these allegations immediately. And if I find any company to have breached the rules, I will ask the Commission to act and punish the companies concerned.

There are also questions, however, about whether these rules strike the right balance when it comes to restrictions on internet sales. Should a company, for example, be allowed to exclude internet-only retailers from its distribution system? I have heard today from companies who think that that is the best way to protect a brand image. I have also heard from companies that use internet-only retailers but impose strict conditions on them. And I have also heard from consumers who believe that consumers should have the right to choose.

This is going to be an important issue in the forthcoming debate and I hope that we receive more evidence as to the effects of these restrictions.

There are also questions about territorial restrictions. It is a long standing principle of Community law that a company can prevent its distributors from actively selling across borders - this helps to protect investments and efforts made by other distributors. However a company cannot prevent arbitrage and stop its distributors selling - passively - to consumers who are themselves active, and who seek out the distributor.

This distinction between active and passive sales is fundamental - but questions have arisen as to what this means on the internet.

Since the rules were last reviewed, there are a number of practices which are being used by companies to restrict cross border sales which I think require a closer look.

Website redirection, and credit card checks, to name just two practices, may be permissible if decided on by the distributor itself. But if these are imposed on the distributor by the manufacturer, then that seems to risk limiting passive selling - and that is clearly an infringement of competition rules. I intend to look very carefully at these practices, and any others brought to our attention.

So when it comes to physical products, there seems to be room to do more to enforce the competition rules more rigorously to help consumers.

For digitally-delivered products, such as music, the position seems more complicated. But no easier to explain to the consumer.

Why is it possible to buy a CD from an online retailer and have it shipped to anywhere in Europe, but it is not possible to buy the same music, by the same artist, as an electronic download with similar ease? Why do pan-European services find it so difficult to get a pan-European license? Why do new, innovative services find licensing to be such a hurdle?

The answer, as we have heard today is complex. The rights are more complicated, the licensing agreements are more complicated, the issue, so everyone has told me, is more complicated.

The world is always more complicated than we would like it to be. But that is no excuse for inaction. Collecting societies and music labels have come a long way since 1851, the time of Bourget and his sugared water, but the world has changed around them. Artists have changed, distribution has changed, and consumers have changed. There is a perception, though, that the collecting societies and the music labels have not.

Collecting societies have a vital responsibility in looking after the interests of artists. That is only right because music is a vital part of our society and our culture. It always has been and it always will be. But where regional monopolies are not necessary - in the online world - then I want to hear more about whether the current system really helps the artists and whether it serves the consumer.

As you know, I have been trying - through a range of cases in the telecoms and music sectors amongst others - to make the Single Market a reality for new products and services. Today's debate has provided me with useful input to better understand this market and see what needs to be done in the future. If the competition rules are breached, you know already that I will continue to be active. If other changes are needed, I will support my colleagues in making the necessary changes. However I believe that the music industry can reach sensible solutions, allowing simple, workable licensing systems to be created.

Historically, the copyright system has always found a solution for dealing with complex licensing issues and technological change. Indeed the collecting societies themselves developed to solve just such problems.

But if a solution to the problems we face today is not found, then the music industry can hardly complain if regulators or enforcers step in.

I want again to thank all of you for your contributions, and to repeat that this is the start of a discussion, and not the end of one. Each of you now has an opportunity to submit more detailed comments in writing, and I will be inviting third parties to do the same, and to comment on the report of this meeting that will be prepared.”

Blog Posting Number: 1226


Friday, September 19, 2008

BPN 1225 Pre-internet (11): The arrival of internet

While the online industry was developing and coping with disruptive technologies like videotext and CD-ROM, a new online phenomenon was emerging in academic circles. It started in 1969, when the ARPA project was launched by the US Department of Defence (DoD). It was the Cold War, with its division between the two superpowers, the USA and the Soviet Union. In order to keep a network alive after a bombing, or worse a nuclear bombing, a new network had to be designed which would leave the parts not hit operational. This required a new network protocol. By 1971 this led to packet-switching telecom technology. It also yielded a new way of co-operation between the Department of Defence and the universities which executed assignments for the DoD, and between the universities mutually. A number of networks came into existence, which were eventually based on a new series of protocols, TCP/IP (Transmission Control Protocol over Internet Protocol), developed by Bob Kahn and Vint Cerf between 1972 and 1976. The protocols ensured that networks could exchange electronic mail and information. By 1986 the DoD started talks to move the Internet over from the department to the National Science Foundation (NSF). The Internet as it existed in the USA became an interrelated network of networks, expanding to universities and research centres in other countries.

The network was mostly used for e-mail, for file transfer (FTP) and other facilities such as Usenet. However, there was no overlay in the system which made it easy to jump from one function to another or from one server to another. This was, for the Brit Tim Berners-Lee, a researcher at the European particle physics laboratory CERN in Geneva, the moment to start thinking about the World Wide Web in 1990.

Tim Berners-Lee defined the first web client and server in 1990, with specifications for web addresses (URL, Uniform Resource Locator), the transfer protocol (HTTP, Hyper Text Transfer Protocol) and the mark-up language (HTML, Hyper Text Mark-up Language); HTML was based on the ISO standard 8879 of 1986, known as SGML (Standard Generalised Mark-up Language). In 1991 he developed the first pages on a NeXT machine.
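The division of labour between these three specifications can be shown in a small Python sketch; the URL is that of CERN's first web page, while the HTML fragment is a made-up minimal example:

```python
from urllib.parse import urlparse

# A URL names a resource: the scheme says which protocol (HTTP) to use,
# the netloc names the server, and the path names the document on it.
url = urlparse("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(url.scheme)   # http
print(url.netloc)   # info.cern.ch
print(url.path)     # /hypertext/WWW/TheProject.html

# A minimal HTML document of the kind an early browser would render,
# with one hypertext link marked up in an anchor tag:
page = (
    "<html><body>"
    "<p>See the <a href='http://info.cern.ch/'>WorldWideWeb project</a>.</p>"
    "</body></html>"
)
```

Together these give the jump-from-anywhere overlay the earlier network lacked: any document can point, via a URL in an anchor, at any other document on any server.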

With this material browser-like products were produced, such as Gopher, a distributed search and retrieval protocol whose important feature was its text menu. It was soon superseded by Mosaic, which offered a user-friendly graphical interface. This web browser took the computer world by storm and popularised the World Wide Web; it was soon followed by Netscape Navigator.

The online industry, with its ASCII databases and videotext services, was taken by surprise and had problems understanding the depth of the change. The internet expanded fast and the traditional online industry had no answer. Databases sprang up like mushrooms and, coming from the academic world, were available at no cost. Information providers with ASCII databases were able to convert their data to the HTML, and later the XML, standard. Yet the pioneer host Dialog was sold off to the Canadian publisher Thomson. The videotext services, however, were hard to convert due to their page structure, and the videotext information providers were left in the cold; only a few were able to cross over to the internet. Yet a completely new industry with new players came up, all using the same protocol. Scientists could put their databases online and pre-publish their articles, and universities could build up repositories of PhD theses and scientific articles. Businesses started sites for company promotion, and online business became a new line of trade. And the fans of the bulletin board systems made the cross-over easily and started their own sites. The internet protocol gave any target group the opportunity to get online.


Blog Posting Number: 1225

Tags: ,

Thursday, September 18, 2008

Pre-internet (10): Frozen online

The online industry was developing in the seventies and the beginning of the eighties. But online was expensive, especially for intermediaries in science, for scientists and for business people. On the other hand, information providers, usually publishers of scientific, technical and medical (STM) information, wanted more penetration for their electronic information. So they looked around for carriers other than online, especially in the optical field. By the end of the seventies the laser disc had been launched as a consumer product for film, but it lost its race against the video tape. In the STM world, however, there was interest in the laser disc. In the early eighties a group of STM publishers, among which Elsevier Science and Blackwell Scientific, formed the consortium Adonis to produce and distribute scientific articles on laser disc.

However, this technology was superseded by CD-ROM technology in 1984, when Philips started to produce CD-ROMs. The carrier was seen as an adequate storage medium with its 600MB; for many STM publishers this was sufficient space to store many of their text databases. Besides, the publishers could promote and handle this medium themselves. In the first year, though, there was a problem with the logical file structure, which bound a product to a particular brand of CD-ROM player. So twelve hardware and software parties came together and established the High Sierra format, which was turned into the ISO 9660 standard in 1988. From the High Sierra format onwards, any information provider could deliver a silver disc for any CD-ROM player. And the STM publishers and information providers made use of it, as it allowed much the same subscription-based distribution as magazines and books.

To the online industry, CD-ROM was a disruptive technology. A CD-ROM could hold 600MB of data, which is a lot of text. And as most of the databases were archive databases, putting a database on CD-ROM was cheaper and more profitable than putting the material online. Besides, with CD-ROM no taxi meter was running in the back of the researcher's mind. An information provider with timely information could choose a hybrid distribution model, offering the timely information online and the archive on disc. This business model was adopted by STM publishers and information providers as well as business publishers.

The Dutch publisher Kluwer had started a commercial online service in 1980 and began to experiment with CD-ROM in 1987. By 1988 it published its legal database on the silver disc, while it used online only for timely matters. The Royal Tropical Institute ran its database Agris on two host computers/servers. Its audience consisted of intermediaries in countries with problematic telephone networks and high costs. So when the institute started to distribute the disc KIT Abstracts, it reached more subscribers than it had online.

Although many a publisher believed that the combination of online and CD-ROM had a healthy life expectancy, the business model was over by 1995, and the publishers and information providers had to cope with another disruptive technology: the internet. Companies like Kluwer had to invest again. And Elsevier Science pursued the radical idea of bringing all its publications, including its article archives, into the ScienceDirect database. By 2000 STM text CD-ROMs were over and online had been defrosted.


Blog Posting Number: 1224

Tags: ,

Wednesday, September 17, 2008

Update Three Dutch nominations in Europrix Multimedia Awards 2008

Among the 23 nominations for the 2008 edition of the Europrix Multimedia Awards are three by Dutch designers. The nominations were selected from 338 entries from 32 countries. The winners will be presented at the gala in Graz (Austria) on 29 November 2008.

The Dutch nominees are:

Title: Images of the Street
Designer: Sandra Karis (NL)
School: Utrecht Graduate School of Art, Media, Music and Technology
URL: http://emma.hku.nl/

Title: Kika and Bob
Designer: Fons Schiedon (NL)
Company: Submarine
URL: http://www.kikaandbob.com/



Title: Treehuggers, A Study in Immersive Animation
Designer: Gatze Zonneveld (NL)
School: Utrecht School of the Arts
URL: www.don-quixote.nl/pages/treehuggers.htm

Update NRC Handelsblad starts international site in English

The Dutch quality paper NRC Handelsblad has started an English-language site in co-operation with Spiegel Online. The site contains news from NRC Handelsblad, but will also publish items by DutchNews.nl. See video.

BPN 1223 Pre-internet (9): Videotex outstripped by time and technology

Videotex was a European technology. It was heavily stimulated by national politics and by the European Commission; grants were provided to promote research and development of the technology and the market. Despite the support, videotex met with varying success.
For Great Britain the whole videotex adventure lasted 20 years, from the start in 1971 till the sale of assets in 1991. Despite the fact that Prestel made use of common household devices such as the telephone and the television, it was still an expensive proposition. As a result, Prestel gained only limited market penetration among private consumers, achieving a total of just 90,000 subscribers. Prestel also received competition from other BT value-added services, such as the combined e-mail and database system Telecom Gold.
In The Netherlands the technology also existed just twenty years, from 1976 (its first demonstration) till 1996 (the end of Videotex Nederland). By that time the technology was over and done with. In the end roughly 350,000 users had made use of the public systems Viditel and its successor Videotex Nederland.
Other European countries, such as Italy, Austria, Yugoslavia, Hungary and Germany, bought the Prestel system. Italy registered 180,000 subscribers.
Remarkably, the success of the French Télétel/Minitel was never repeated in any other country, let alone by any other videotex system. Yet Prestel had more success selling its system abroad than Télétel/Minitel. The success in France is often attributed to the free handing-out of the Minitel terminals. However, I think that the Kiosque model was the key to success: the Kiosque was not regulated by the French PTT, and the information providers felt more involved.
But outside Europe, too, videotex was no success. Singapore used a variation on the Prestel system, using a telephone line for prompting the system while cable was used for downloading, which provided a higher speed and allowed the transmission of pictures. Yet the system never got out of the starting blocks, and the same goes for the American trials Green Thumb and Knight-Ridder. The Canadian Telidon and the Japanese Captain systems never got beyond their trials either.

Part of the penetration problem was the collision of technologies. When videotex came onto the market, the PC was being introduced in companies and bought as a toy by amateurs. Users found out that videotex technology and mini-computers and PCs were incompatible; all kinds of conversion programs had to be written to transfer information from videotex devices to PCs for processing and storage of data.
It is also interesting to see that operational videotex remained limited to Europe, while in the United States the residential services like The Source, CompuServe, Prodigy and AOL also struggled to get enough market penetration. One should not forget that the seventies were a time of technological change, with the introduction of computing and the first sales of PCs. In the eighties consumers were still discovering how to make sense of all these new technologies and how to cope with these new devices.

Could videotex have won the technology race against ASCII databases, e-mail and bulletin board systems? I personally do not think so, as the videotex technology was too much of a suit of armour, limiting the information provider and the user. Besides, the retrievability of information was too limited, due to the tree structures and page-oriented navigation.

British Telecom, the telephone company spun out of the British Post Office, was early to abolish Prestel, in 1991. Other countries were later: The Netherlands dumped Videotex Nederland on January 1, 1997. In France Télétel/Minitel is dying out, as the number of information providers declines by 30 percent a year. In 2005 there were still 6 million Minitels (at its height there were more than 20 million), still yielding 351 million calls for 18.51 million hours of connection, generating € 206 million in revenues, of which € 145 million was redistributed to 2,000 service providers. But the success of Télétel/Minitel also had a drawback with the introduction of the internet in France: in the nineties France turned out to be the first country on gas and the last on electricity, and it was one of the laggard countries in adopting the internet.


Blog Posting Number: 1223

Tags: , ,

Tuesday, September 16, 2008

BPN 1222 Pre-internet (8): videotex to conquer the world

The British Post Office, creator of videotex, started to promote the system around the world, and it also sparked variations on the system. The rush for a consumer system was on. Although the PC had come to the market, pushed by Hewlett-Packard (HP) and by the newcomer Apple, it had not yet gained speed as a consumer product. So there was still a window for a television-oriented text service for residential and professional use.



USA
The British Prestel system was migrated and adapted to the American television system. The US Department of Agriculture initiated a 16-month pilot in the state of Kentucky under the name Green Thumb. Another trial, with a different system, was held in the Miami region by the newspaper conglomerate Knight-Ridder. Both projects were discontinued, with the rise of information services like The Source and CompuServe cited as the reason.

Japan
In Japan a trial was started in 1978 with a system dubbed Captain (Character And Pattern Telephone Access Information Network system). 1,000 devices, ready to receive text, were handed out. The information was sent from a central system through the telephone line to the consumer's device. The television set had a decoder with a memory of 64K. The screen pages consisted of only 8 rows of 15 characters. The Captain system could handle the Roman alphabet as well as the Kanji, Hiragana and Katakana symbol sets. The trial involved a daily newspaper.

Canada
In 1979 Canada showed its own videotex system, dubbed Telidon. Compared with the British Prestel system, Telidon had the advantage of more graphical possibilities: instead of a mosaic of six small squares, as in the British Prestel system, the Telidon letters and symbols were built up from points. A car's wheels would look less square in Canada than in Britain. The Canadian trial lasted from 1979 till 1982, and during the trial 100,000 pages were entered. The trial encompassed some 1,000 terminals; Bell Canada paid for the experiment.

The Netherlands
The British videotex system was shown in the Netherlands in 1976, at a closed meeting during a conference of cable operators. By 1978 the Dutch PTT announced that it was going to introduce the Prestel system. The Dutch publisher VNU tried to reach a deal with the Dutch PTT in order to control the consumer and professional markets, but the Dutch PTT got out from under this agreement. On August 7, 1980 the Dutch PTT started its videotex service, dubbed Viditel (I see from afar). One year later VNU got its own videotex computer, but it was never able to make it profitable with professional services. In order to get better penetration for videotex, VNU started Ditzitel, a project with videotex technology but with cable transmission instead of telephone. Ditzitel never got the technology right, although by the end of the eighties the technology was operational.

France
France had a different point of departure. The country needed a new telephone network, and plans for it were put on paper in 1975. The paper became not only a technical specification of a network; it also contained ideas about information services. The Centre National d'Etudes des Télécommunications (CNET) developed a terminal under the codename TICTAC (Terminal Intégré Comportant un Téléviseur et l'Appel au Clavier), which later became known as Minitel. In order to stimulate use, a program for the free distribution of 1 million Minitels was set up. But the French PTT also thought about content. It planned to replace the printed regional directories, with a total of 34 million telephone numbers, by an electronic directory, with the advantages of 24-hour availability and the phasing out of the print edition. The electronic directory was launched in 1981.
The French videotex system Télétel became a success. This is often ascribed to the free handing-out of Minitel terminals to stimulate the use of the telephone directory. But not all Minitels were free. The success can rather be attributed to the distributed network organisation, to which information providers could hook up, and to the freedom those providers enjoyed in using the distributed network.
With millions of Minitels around, Télétel/Minitel generated millions of online minutes, among other things with sex messages via the messagerie rose, the pink e-mail service.

Blog Posting Number: 1222

Tags: ,

Monday, September 15, 2008

BPN 1221 Pre-internet (7): Videotex, tree content

Videotex content
Videotex differs from ASCII in presentation and linking: videotex is a presentation protocol, not a communication protocol like ASCII. While in ASCII the lines fill the screen from the top, videotex presents information as a page. All those pages hang together like a mobile artwork from the ceiling, but a mobile with nine levels. The top two levels (0 to 9 and 01 to 99) were destined for system activities. The third level (001 to 999) was intended for starting pages. An information provider could use six levels to distribute his or her information. Every page could be expanded with add-on pages from a to z. Via a tree menu one could search for information, and by keying in page numbers one could reach a page directly; in this way it was possible to use the system for timely information, by keying in a date like 800915 (15 September 1980). In this way it was also possible to search sideways. Searching by keywords, as in ASCII databases, was not possible, unless the information provider put up a list of controlled keywords as links.
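The page-numbering scheme just described can be sketched in a few lines of Python; the function names and sample pages are my own illustration, not part of any actual videotex implementation:

```python
# Sketch of videotex-style page addressing: one or two digits are system
# levels, three digits are starting pages, longer numbers are information
# pages, and a trailing letter a..z marks an add-on page.

LETTERS = "abcdefghijklmnopqrstuvwxyz"

def level(page: str) -> str:
    """Classify a page number by its depth in the tree."""
    digits = page.rstrip(LETTERS)
    if len(digits) <= 2:
        return "system"
    if len(digits) == 3:
        return "start page"
    return "information page"

def parent(page: str) -> str:
    """Go one level up: drop the add-on letter, or else the last digit."""
    digits = page.rstrip(LETTERS)
    if digits != page:      # an add-on page belongs to its base page
        return digits
    return digits[:-1]

print(level("07"))        # a system page
print(level("123"))       # a starting page
print(level("800915"))    # an information page keyed by date (15 September 1980)
print(parent("800915a"))  # back to page 800915
```

Keying in a full number jumps straight to a page; walking up with parent() mimics the tree menu.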

Illustration of tree structured pages:
- top two system levels;
- level three is starting page;
- information pages;
- add-on pages.

Every page had 24 rows of 40 positions (letters, figures, symbols and diacritical signs). Every position was built up as a small mosaic of six squares, which could be used to produce rudimentary graphical representations such as cars (with square wheels). Text as well as graphical representations could be embellished with one of seven colours (white, black, green, blue, red, yellow, magenta).
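As a sketch of that page format (24 rows of 40 positions, each with one of seven colours), here is a hypothetical page buffer in Python; the cell layout and function names are invented for illustration:

```python
# A hypothetical videotex page buffer: 24 rows of 40 positions, each
# position holding a character and one of the seven available colours.

ROWS, COLS = 24, 40
COLOURS = {"white", "black", "green", "blue", "red", "yellow", "magenta"}

def blank_page():
    """A page as a 24 x 40 grid of (character, colour) cells."""
    return [[(" ", "white") for _ in range(COLS)] for _ in range(ROWS)]

def put_text(page, row, col, text, colour="white"):
    """Write text into a row, clipped at the 40-position line width."""
    assert colour in COLOURS
    for i, ch in enumerate(text[: COLS - col]):
        page[row][col + i] = (ch, colour)

page = blank_page()
put_text(page, 0, 0, "NEWS HEADLINES", colour="yellow")
print(len(page), len(page[0]))  # 24 40
```

The fixed grid makes clear why authors had to write to the page rather than to a scrolling screen.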

The page orientation had consequences for the writing of content. The author was limited by the space of a page and always had to think about navigation. Every page needed links to go back to the level above or to the starting page, as well as to the exit page (which was hardly used); but there was also a need to lead the user in navigating from top to bottom, from the bottom pages up, and sideways, by using index pages (see illustration).

Videotex was seen as a consumer information system. The first British Prestel system contained 16 sections:

Buying a Car, An Evening Out, Houses for Sale, Local Information, Social Guidance, Looking for a Job, Entertainment, Education, Financial Information, Market Intelligence, Business Intelligence, Community Services, Route Planning, Holiday Information, News, Sports Results.

Source: The Viewdata Revolution by Sam Fedida and Rex Malik; Associated Business Press, 1979

Organisation
The organisation of the videotex service was similar to that of the ASCII database services, except that the videotex services in the eighties were claimed by the national mail and telephone companies as an extra service and source of revenue. In the UK the British Post Office claimed the service, in The Netherlands the Dutch PTT, in West Germany the Bundespost and in France the French PTT.
But soon a difference arose between the organisation of services following the British Prestel model and those following the French Kiosque model. In the Prestel model the PTT controlled the system, did the marketing and publicity, and handled the revenues on behalf of the information providers. The PTTs even got involved in content for the first time, by claiming the common index and keyword maintenance.
The French PTT was less in control and left the content business to the information providers and publishers. It was involved with maintaining the system and the network, as well as handling the revenues. But the information providers and publishers were more independent in setting up their services, including the technical side. They could link their own computers to the central computers and use the service like a newspaper kiosque, promoting their own products. This has been a key success factor in the French Télétel project.


Blog Posting Number: 1221

Tags: ,

Update: Butterfly Tattoo in world premier showing

Last Saturday the movie Butterfly Tattoo, based on the novel of the same title by the British author Philip Pullman, went on screen for the first time during the film festival Film by the Sea in Vlissingen (Flushing) in The Netherlands. The production team and the cast were present. I have written before in a posting about this movie (and my small investment in it).

The photograph was taken after the showing of the movie. The three people on the left are members of the production team; the three people on the right are the two main actors and the producer.

There are no reviews yet. Personally I found it remarkable that such a low-budget movie had such a professional look. The movie was dramatic, emotional and entertaining, with humour and fine music. It is also a very British movie in its language and its theme of class distinction. The pace could have been faster in some parts, but the movie was not boring.

Presently the movie The Golden Compass, based on a novel by the same author, can be seen in the theatres; it was produced for 16 million US dollars. The Butterfly Tattoo was produced on a budget of a little over 200,000 euro. The money was raised within two days of a front-page article in the Dutch financial daily FD describing the project of three students. The production team is presently negotiating the distribution rights.

Sunday, September 14, 2008

BPN 1220 Pre-internet (6): Online from Europe by television

Online was started by the Americans, using computers; information was shown on a terminal screen or on paper. But another development came from television. The basic idea was that a central computer would send textual and rough graphical information to a television screen. The information was transported by television waves at first, and later by telephone.
The US company RCA started the development of a text system for television under the name Homefax. In 1971 Mitre Corp. started the first tests of text bundled on a television wave. However, the company was overtaken in its ambition by the British broadcaster BBC, which announced that it had developed a system, Teledata, to broadcast news in text form to an adapted television set, independently of the programme being broadcast at the time and without interrupting it. The texts could also be coupled with a television programme for subtitling or translation. The BBC described the system in an internal memo of December 14, 1970, and a patent was applied for on February 9, 1971. The BBC named the system Ceefax (see facts) and ran trials from 1974 till 1976. An official Ceefax editorial staff was operational from 1976 onwards. However, success depended on the number of adapted television sets (10,000 sets in 1978 and 40,000 sets in 1980). The commercial broadcasting company started a teletext service in 1973 under the name Oracle (Optical Reception of Announcements by Coded Line Electronics). Both systems presented the weather, stock quotes, and news and sports items.
The British system was exported to the Netherlands, West Germany, Flanders in Belgium, Sweden, Denmark, Austria, Australia and Hong Kong. France developed a system of its own under the name Antiope (Acquisition Numérique et Télévisualisation d’Images Organisées en Pages d’Ecriture). Other standards were developed as well, in Japan (Captain) and Canada (Telidon).

From teletex to videotex
The development of Teledata by the BBC was followed with great interest by Sam Fedida of the British Post Office Research Department. In 1970 he combined the Teledata idea with the telephone system and developed the viewdata concept: an online system over the plain old telephone network, with the television as the information delivery device. (The generic system name viewdata was dropped for trademark reasons and the system was dubbed videotex, roughly Latin for "I see text".) With the consumer in mind, and in the absence of a personal computer (only introduced from the mid-seventies onwards), the system was seen as the Volkswagen of the online industry: a costly central computer, a common telephone network and a mass-consumption television set as delivery station. Only a modem and a decoder were needed, and these devices would become cheap through mass production. The advantage of telephone transmission was the interactivity between the user and the central computer; with transmission through television waves, information could only be sent one way.
Sam Fedida made his first presentation on January 13, 1976. In the same year Queen Elizabeth II sent a first message by videotex. In October 1978 the first test service was opened on the Waterloo computer in London, and in the same year the Financial Times and the financial information company Extel started a financial service. In September 1979 the British Post Office officially started Prestel, the commercial videotex service for business and consumer use.

Cover of the book The Viewdata Revolution by Sam Fedida and Rex Malik; Associated Business Press, 1979.

Blog Posting Number: 1220

Tags: ,

Saturday, September 13, 2008

BPN 1219 Pre-internet (5): Consumer services

Online was not exclusively the domain of scientists and businessmen. Small and medium-sized companies as well as well-to-do private persons were interested, and looked for timesharing services from 1969 onwards. Consumer services proper, however, were set up some ten years later, from 1979 onwards.

Consumer services
Some companies with a computer saw an opportunity to make money by selling idle computer time at night. It was the start of the consumer online companies. The online service CompuServe started this way in 1969. While best known for its consumer division, the CompuServe Information Service, CompuServe was also a world leader in other commercial services. Another consumer online company was The Source, which offered e-mail and chat from 1979 onwards; CompuServe started to offer e-mail in 1979 and chat in 1980.

The Source (Source Telecomputing Corporation) started service in 1979 as an online service, one of the first such services to be oriented toward and available to the general public. Intended for use with 300 bit/s and 1200 bit/s dial-up telephone connections, The Source was text-based for most of its existence. At its peak, The Source had 80,000 members. During much of its existence it charged a start-up fee of about $100 and hourly usage rates on the order of $10 per hour. It provided news sources, weather, stock quotations, a shopping service, electronic mail, various databases, online text of magazines, and airline schedules. It also had a newsgroup-like facility.
In 1989 CompuServe acquired The Source, which turned out to have many ghost accounts, and dismantled its competitor. Eventually, in 1997, the pioneer consumer online service CompuServe was itself acquired by America Online (AOL).

Prodigy Communications Corporation (Prodigy Services Corp., Prodigy Services Co., Trintex) dates back to 1980, when broadcaster CBS and telecommunications firm AT&T formed a joint venture named Venture One in Ridgewood, New Jersey. The company hoped to introduce a videotex-based TV set-top device that would allow consumers to shop at home and receive news, sports and weather. After concluding the market test, CBS and AT&T took the data and went their separate ways. In 1984 Prodigy was founded as Trintex, a joint venture between CBS, computer manufacturer IBM, and retailer Sears, Roebuck and Company. CBS left the venture in 1986. The company's service was launched regionally in 1988; a nationwide launch followed in 1990. It was the second largest online service provider, its 465,000 subscribers trailing only CompuServe's 600,000. Under the guidance of editor Jim Bellows, Prodigy developed a fully staffed 24x7 newsroom with editors, writers and graphic artists intent on building the world's first true online medium. The initial result was that Prodigy pioneered the internet portal: a single site offering news, weather, sports, communication with other members, and shopping for goods and services such as groceries, general merchandise, brokerage services and airline reservations, as well as lifestyle features including popular syndicated columnists, restaurant surveys, Consumer Reports articles, test reports and games for kids.

AOL (formerly America Online, Inc.) is an American global internet services and media company. It was founded in 1983 as Quantum Computer Services and franchised its services to companies in several nations around the world or set up international versions of its services. AOL offered millions of customers around the world access to the world's largest "walled garden" online community, and eventually a way out to the internet as a whole. In January 2000 AOL and Time Warner announced plans to merge, and the deal was closed in 2001. AOL began life as a short-lived venture called Control Video Corporation (CVC), founded by Bill von Meister. Its sole product was an online service called Gameline for the Atari 2600 video game console, after von Meister's idea of buying music on demand had been rejected by Warner Brothers. Subscribers bought a modem from the company for $49.95 and paid a one-time $15 setup fee. Gameline permitted subscribers to temporarily download games and keep track of high scores, at a cost of $1 per game. After the telephone connection was dropped, the downloaded game remained in Gameline's Master Module and stayed playable until the user turned off the console or downloaded another game. In 1985 the company launched a dedicated online service for Commodore 64 and 128 computers, originally called Quantum Link ("Q-Link" for short). The joint venture with Time Warner has not delivered Time Warner's content engine to AOL, which could have made it a real new publisher.

Bulletin board systems
For the real amateurs the electronic messaging service FidoNet became available. This e-mail system was developed by the American Tom Jennings, who named the network after his dog. In 1984 he succeeded in realising a computer network for PC amateurs at a fraction of the cost of a subscription to The Source or CompuServe. The core of the success was the bulletin board software: copies were multiplied and distributed freely to PC amateurs, as long as they were willing to act as system administrators. In this way more nodes were created in the network, along which messages could be sent at night at off-peak tariffs. Amateurs could call in at the local telephone rate. The messages could also be sent to other networks, but no guarantee of receipt was given. When the internet was introduced, the population moved over from bulletin board systems to the internet without any trouble.
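The store-and-forward principle behind FidoNet can be sketched as follows; the node names and the one-hop-per-night model are a simplification of my own, not the actual FidoNet software (which used zone/net/node addressing):

```python
from collections import deque

# Store-and-forward in the spirit of FidoNet: each node queues outbound
# mail and passes it one hop per nightly exchange. Delivery is best-effort,
# just as the text notes: no guarantee of receipt.

class Node:
    def __init__(self, name):
        self.name = name
        self.inbox = []            # delivered messages
        self.outbound = deque()    # (destination, message) pairs awaiting the night run

    def post(self, dest, message):
        self.outbound.append((dest, message))

def night_run(route):
    """One nightly exchange: every queued message moves exactly one hop."""
    # Snapshot all queues first, so a message moves only one hop per night.
    pending = {i: list(node.outbound) for i, node in enumerate(route[:-1])}
    for i in pending:
        route[i].outbound.clear()
    for i, msgs in pending.items():
        nxt = route[i + 1]
        for dest, msg in msgs:
            if dest is nxt:
                nxt.inbox.append(msg)               # reached its destination
            else:
                nxt.outbound.append((dest, msg))    # store, forward the next night

a, b, c = Node("A"), Node("B"), Node("C")
a.post(c, "hello from A")
night_run([a, b, c])   # night 1: the message reaches node B
night_run([a, b, c])   # night 2: B forwards it to C
print(c.inbox)         # ['hello from A']
```

Each intermediate sysop's machine stores the mail until the next cheap night-time call, which is why delivery could take days and carried no guarantee.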


Blog Posting Number: 1219

Tags: bulletin board system, walled garden, portal

Friday, September 12, 2008

BPN 1218 Pre-internet (4): European databases

Europe, too, was active with online very early. In 1969 Lockheed performed an assignment for the European Space Agency (ESA) in Frascati (Italy), installing the NASA database RECON. From 1971 ESA, as ESA/Information Retrieval Services, ran the first European online service, with the NASA RECON and Chemabs databases for scientific and technical research. The service could position itself thanks to the cheap telephone tariffs of Tymnet.

The ESA initiative spawned online services in many European countries. Usually it was the governments which stimulated the use of online services for scientific and technical research.

In West Germany the government published a program for the promotion of information and documentation in 1974. One of the plans was to found Fachinformationszentren (FIZ) for the disciplines environment, technology, patents and research. The first FIZ started in 1977. In 1978 the online service INKA was launched, and by 1982 it already had 40 databases, among which the medical database DIMDI. In 1983 a distributed online service was launched under the name STN International; it linked the FIZ databases, Chemical Abstracts in the USA and the Japan Information Center of Science and Technology (JICST).

In France the scientific and technical host Questel was founded in 1975. In the following years the online service specialised in patents, IPR and trademark databases. In 1994 the US online service Orbit was acquired from the legacy of Robert Maxwell’s media empire.

In the seventies the UK was seen as a stepping stone to Europe for the US online information services. Due to the success of the ESA services, Lockheed’s online service, in the meantime named Dialog, saw possibilities to sell its databases in Europe. Roger Bilboul, with his company Learned Information (bought by VNU in 1994), was asked to be the representative for Lockheed’s Dialog in Europe.
BOC DataSolve was originally a UK company, the computer department of the multinational BOC. The computerised documentation department developed into an online information service, which was sold to the television and record company Thorn-EMI. The service developed into a media host: it contained the complete full text of, for example, the Financial Times and of radio broadcasts such as the BBC World Service. All those texts, including those of the newspapers, had to be retyped. In 1984 the Financial Times bought the service and changed its name to FT Profile, which was later bought by Lexis/Nexis.
In 1979 BPCC, the media company of Robert Maxwell, started to profile itself. Through its scientific publishing company Pergamon, Maxwell bought the remains of the bankrupt Infoline to form the scientific online information service Pergamon-Infoline. In 1987 the US service Orbit was added and the company continued as Pergamon Orbit Infoline, to be divided up in 1991: Orbit was acquired by Questel and the rest was sold off.

In Switzerland the business information service Data-Star was launched in 1980. The service was financed by Radio Suisse. The online service bought the BRS search software, but made the mistake of buying a version which it could not maintain itself. The service was eventually bought by the Canadian publisher Thomson, which also bought Lockheed’s information service Dialog.

The online industry
The online industry in the US gained recognition in 1977, when Online Inc. (see team on photograph) started the magazine Online and organised the first online conference and exhibition. The conference and exhibition were mainly visited by librarians, online intermediaries and business intelligence users. The business grew fast from 1977 onwards, as PCs were introduced by HP and Apple.

Online got a face in Europe when Learned Information started the magazine Online Review and the annual Online Conference in London in 1977. This conference and exhibition were aimed at documentalists and online desk researchers. In 1980 online in Europe got a stimulus from the European Commission with the launch of the Euronet DIANE network (see photograph). Through this network of the European Economic Community the European countries were connected and ready for a European online industry. Besides offering a network, the European Commission stimulated the development of databases with grants. In 1983 VNU Business Publications in London received a grant to develop IDB Online, the first daily electronic newsletter for the computer industry.


Blog Posting Number: 1218

Tags:

Thursday, September 11, 2008

BPN 1217 Pre-internet (3): Information retrieval industry

In a short time online services became commercial companies. The heart of an online service was its portfolio of textual and numerical databases plus the search engines. The first databases consisted of collections of abstracts of scientific, technical or medical articles (secondary information). But soon databases were published with the full texts of laws, verdicts and newspapers (primary information). Numerical series like stock quotes were also databased. All these databases were made searchable by an information retrieval program.

The texts of these databases were processed into various kinds of indexes, so that words could be found both in and out of context. Controlled keywords could also be searched. With the search engine, words and keywords could be combined with the Boolean operators AND, OR and NOT. The retrieval program could also search on words and adjacent words; this type of searching was called full-text retrieval. Besides search facilities there were user-friendly facilities such as highlighting of the requested terms; the highlight was introduced by Mead Data Central.
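The principle behind those Boolean searches can be sketched in a few lines. This is my own illustration, not the historical retrieval code: each index term points to the set of document numbers containing it, and AND, OR and NOT become set operations on those lists.

```python
# Toy inverted index: each term maps to the set of document ids
# (all terms and ids here are invented for the example).
index = {
    "pollution": {1, 2, 5},
    "air":       {1, 5, 7},
    "water":     {2, 3},
}

# pollution AND air -> intersection of the two posting sets
print(sorted(index["pollution"] & index["air"]))    # [1, 5]

# pollution OR water -> union
print(sorted(index["pollution"] | index["water"]))  # [1, 2, 3, 5]

# pollution NOT water -> difference
print(sorted(index["pollution"] - index["water"]))  # [1, 5]
```

Adjacency ("full-text") searching needed more than this, of course: the index then also had to record word positions within each document.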

Searching was an art. The low modem speed, the slow processing of the central computer, the connect time and the tariffs of the data lines functioned as a ticking taxi meter for the user. Through training, users learned methods to search fast, and they passed tricks on to each other. Professional searchers were like terminal machinists, competing with each other to reach the best results in the shortest time and at the lowest tariffs.

The organisation of a commercial ASCII information service was fourfold: a computer centre, a sales organisation, one or more information providers and a user. The computer centre took care of loading databases and maintaining the service and the telecom facilities. The sales organisation sold access to the service and the databases and took care of the marketing and training. The information providers offered one or more databases to the online service. The user sought access to the information service and searched the databases.

The tariffs were composed of several items. The user paid a subscription to the service, connected system time and a copyright royalty for the use of the database. Separately, the user had to pay the telecommunication company for connect time. From the beginning it was a production-oriented tariff. But marketing came in when BRS entered the market with a bang, offering searching for a fixed tariff where the others offered time- and royalty-based tariffs.
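How that taxi meter added up can be made concrete with a small calculation. The rates below are invented for the example, not historical figures; only the structure (connect time plus royalty plus telecom charges) comes from the tariff model described above.

```python
def session_cost(minutes: float, connect_rate_per_hour: float,
                 royalty_per_hour: float, telecom_per_hour: float) -> float:
    """Cost of one search session billed by connect time.

    All three components tick along with the same clock, which is why
    fast searching paid off so directly.
    """
    hours = minutes / 60
    return hours * (connect_rate_per_hour + royalty_per_hour + telecom_per_hour)

# A 15-minute session at hypothetical rates of $60/h system time,
# $30/h database royalty and $10/h telecom charges:
print(session_cost(15, 60, 30, 10))  # 25.0
```

A fixed-tariff offer like the one BRS made simply removed the per-hour terms for the customer, which is why it stood out against the competition.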


Blog Posting Number: 1217

Tags:

Wednesday, September 10, 2008

BPN 1216 Pre-internet (2): USA commercial online information services

In 1964 Lockheed set up its Information Sciences Laboratory under the direction of Roger Summit and purchased one of the first IBM 360/30 third-generation computers. By 1966 NASA gave Lockheed the assignment to mount NASA RECON, a 30,000-item database, on Lockheed’s internal information system. At the same time the Bunker-Ramo company got the assignment to develop a dial-up service for this database. By 1967 Lockheed was awarded the contract to run the database on behalf of NASA and was granted the ownership of the information retrieval program (search engine, one would say these days).

But Lockheed was not the only company working on an information retrieval service. System Development Corporation (SDC), a subsidiary of the think-tank RAND and run by Carlos Cuadra, developed a retrieval system in 1968 and demonstrated the principle with ERIC, a database consisting of abstracts of educational articles. By 1969 SDC officially presented the retrieval program ELHILL, using 15,000 abstracts on Parkinson’s disease.

Telecommunication was changing by 1970. Packet switching was introduced and new telecom companies were set up for it, while old companies like Teletype aimed for low-cost, low-speed information delivery. Low speed was 110 baud (roughly 110 bits per second, or about 10 characters per second). So remote companies would be able to link to a central hosting organisation and pick up the information needed. SDC got seriously ready for the market with the information retrieval program ORBIT. It went after a contract to back up NLM’s internal installation of ORBIT, but also negotiated the external exploitation of the earliest medical database MEDLINE.
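To get a feel for what 110 baud meant in practice, a rough calculation helps. The figures below are my own illustration: with asynchronous transmission each character took about 11 bits on the line (a start bit, the data bits and stop bits), which is where the rule of thumb of 10 characters per second comes from.

```python
def transfer_seconds(n_chars: int, baud: int = 110, bits_per_char: int = 11) -> float:
    """Seconds needed to send n_chars over a serial line at the given baud rate.

    Assumes ~11 bits per character on the wire (start bit, data bits,
    stop bits), as was typical for asynchronous terminals of the era.
    """
    return n_chars * bits_per_char / baud

# A 1,000-character abstract at 110 baud:
print(round(transfer_seconds(1000)))  # 100 (seconds, i.e. well over a minute)
```

At such speeds every retrieved abstract was a noticeable wait, which explains the pressure on searchers to formulate queries as tightly as possible.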

TI/5/2
1/5/2
78329582
National (Netherlands) network for measurement of air pollution

van Egmond N.D.
NETHERLANDS
INTERMEDIAIR (NETHERLANDS) 1978, 14/4 (40-45), Coden: INTDB
Languages: DUTCH
Since a few years measurement of polluting substances has been carried out outside the urban and industrial areas. The national network for pollution measurement, which became operative in 1975, as well as additional mobile measurements have generated data from which it can be concluded that in rural areas lying at a great distance from the urban and industrial centers, relatively high pollution levels occur. In large parts of the Southern Netherlands in particular, concentrations which equal those in the Northern and Western industrial parts of Holland have been registered. This is caused by pollution transport from large industrial areas; particularly the source areas in the GFR and Belgium appear to be responsible for the high concentrations of SO2 which were measured under favourable weather conditions in the South.
Tags: CHEMICAL AND SEROLOGICAL ANALYSIS AND PR(0102); GEOGRAPHICAL ASPECTS(0401); 4021(4021)
Descriptors: *air monitoring(0061833); *sulfur dioxide(0046727); netherlands(0032705)
Identifiers: national network;
Section Headings:
04604010100-ENVIRONMENTAL HEALTH/MEASUREMENTS OF POLLUTION/Air/Ambient air


A secondary information record of an article on pollution, retrieved from Dialog

SDC and Lockheed were battling it out with their retrieval systems as internal and back-up systems. The fights were also between two strong-willed pioneers in information retrieval: Roger Summit (photograph left) of Dialog and Carlos Cuadra (photograph right) of SDC. Faced with a potential loss of the NLM database, Carlos Cuadra wanted to escape this situation and started a market survey to see whether there was a commercial market for online services. He sent out 7,000 forms, but only 72 were returned. The message looked clear: forget about a commercial market for online information. But on the other hand he saw his competitor, Lockheed’s Roger Summit, making a commercial offering of government databases like NASA RECON, Nuclear Science Abstracts and ERIC; Summit had even picked up Pandex, the first commercial, non-governmental database. Carlos Cuadra could not stand by and watch, and went commercially online as well.

In fact online had become a race, as Mead Data Central, a subsidiary of a paper mill company, went online with its legal database Lexis; the company had picked up experience in the field by digitising the full texts of legal verdicts for the Ohio Bar Association. Later on the service Nexis was added to digitise and commercially exploit the full text of newspapers like the Boston Globe and the Philadelphia; Nexis acquired the database of the renowned New York Times in 1979.

But the competition was not reduced to three major players in the US market. In 1977 BRS (Bibliographic Retrieval Services), a privatised online medical service of the state of New York, came on the market; it was bought in 1980 by Indian Head, the media unit of Thyssen Bornemisza Corporation.

The information industry was oriented to information retrieval. The e-mail business was a separate business, handled mostly by computing companies like General Electric, which ran the e-mail network GEISCO. In the eighties combinations of e-mail servers and online databases started to appear on the market. Westinghouse produced a system that was used in the information service NewsNet in the US and the office service Telecom Gold in the UK. VNU BPG (London) used both systems in 1984 to deliver a newsletter edition by e-mail and offer retrieval facilities on the database end.


Blog Posting Number: 1216

Tags:

Tuesday, September 09, 2008

BPN 1215 Pre-internet (1): Before internet

This is the first instalment of a series of articles dealing with the pre-internet period, ranging from the seventies to the nineties. The industry which came about catered to information retrieval for professionals such as scientists and business intelligence people, and to messaging for business, but also to consumer services. The USA developed in a different way than Europe, which went its own way with videotext. By 1985 the online industry was in for a surprise with CD-ROM: this was, in effect, frozen online. In 1991 internet was introduced; it brought unity in online with the TCP/IP protocol, and it was also the beginning of a new mass medium.

For many people online is synonymous with internet. However, the origin of online had nothing to do with internet or with its predecessor ARPAnet. That network was a military project set up by the US Department of Defense. The study for the network was launched in the middle of the Cold War. The network was intended to survive rocket attacks from Russia. In the conventional telephone network a rocket hit would cut out the entire network; ARPAnet should keep working for the parts not hit. This required another network architecture and protocol, in which the network was not dependent on a central computer. By 1969 the network became operational, starting with computers of some Californian universities.

But at the same time an important development took place in telephone networks. So far networks had been designed for voice traffic, and in fact they were not very suited to data transport. So a new environment for transmitting data had to be developed. In the late sixties packet switching was introduced. With this method information from databases or e-mail messages was chopped into small data packets, which could be sent to a destination through various routes and even various networks. In the fall of 1971 the American engineer Ray Tomlinson transmitted the first electronic message. When asked after 30 years what the content of the message was, he answered that it most likely was the first line of Lincoln’s Gettysburg Address; the message was written in capitals. (Tomlinson also happens to be the inventor of the @ sign in e-mail addresses.) The special packet switching networks which were set up on the basis of this technology formed the basis for the new online industry with information and e-mail services; mind you, at first information retrieval and e-mail were offered separately, by separate companies.
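The packet switching idea described above can be sketched in a toy example of my own making: a message is chopped into small numbered packets that can travel independently, possibly arriving out of order, and are reassembled at the destination.

```python
def to_packets(message: str, size: int = 8):
    """Chop a message into (sequence_number, payload) packets."""
    return [(i // size, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the message from packets, whatever order they arrived in."""
    return "".join(payload for _, payload in sorted(packets))

packets = to_packets("FOUR SCORE AND SEVEN YEARS AGO")
packets.reverse()           # simulate packets arriving out of order
print(reassemble(packets))  # FOUR SCORE AND SEVEN YEARS AGO
```

Real packet networks of course also needed addressing, error detection and retransmission, but the sequence-number-plus-reassembly idea is the core that made routing each packet independently possible.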

Online
In the sixties computers, mainframes and mini-computers, could be connected with each other by modem (modulator and demodulator), a device as large as a VCR. It was originally intended to modulate the data signals for the voice network on one side and demodulate them on the other side. But this was not easy due to the incompatibility of operating systems, hardware and telephone lines. Special lines were leased for the data traffic in order to avoid losing bits.
Technically it was already possible in the sixties to exchange data online. Yet it did not really pick up, as it was easier to exchange punch cards and magnetic reels. But online got a political impulse when John F. Kennedy became president in 1961 and promised the American people that the USA would be the first country to put a man on the moon. This promise accelerated the development of online. Space agencies like NASA and spacecraft builders like Lockheed had to find new ways to make scientific, technical and medical literature accessible. On January 10, 1963 president Kennedy released a press statement announcing the coordination of scientific, technical and medical information. A committee was formed with Alvin M. Weinberg as chair, which produced the report Science, government, and information: the responsibilities of the technical community and the government in the transfer of information (1963) within three months.
At NASA and at a scientific institute like the National Library of Medicine, abstracts of scientific, technical and medical articles were already produced for print magazines, and the punch tapes for the text files were fed into the computer. The abstracts were indexed and made available for desk research. An employee of the institute could retrieve information through the internal network by keying in a keyword or a combination of keywords, upon which the index lists were checked for a match. The result was sent to a screen or printer. The step from the internal network to an external network, by which researchers could use the abstract databases for their automated literature searches, was only logical.
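The keyword lookup described above rests on an index built from the abstracts. A simplified sketch of my own, with invented abstracts and document numbers: every word is listed with the documents it occurs in, and a keyed-in keyword is checked against those index lists for a match.

```python
# Invented sample abstracts, keyed by document number.
abstracts = {
    101: "measurement of air pollution in rural areas",
    102: "abstracts of medical articles on heart disease",
    103: "air quality and industrial pollution transport",
}

# Build the index: every word points to the documents containing it.
index = {}
for doc_id, text in abstracts.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

# A searcher keys in a keyword; the index lists are checked for a match.
print(sorted(index.get("pollution", set())))  # [101, 103]
```

In practice the institutes also added controlled keywords from a thesaurus to each record, so a search did not depend solely on the words an author happened to use.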

Technical ingredients
For the online service the following technical ingredients were required:
1. Host computer (server, we would say these days): a central computer with a communication program and an information retrieval program. These computers were mainframes or mini-computers. They worked on the principle of time-sharing, which made it possible for many searchers to start search actions and retrieve information. Although it looked as if all searchers were online and actively searching together, each of them was individually assigned time to get access to a file.
2. Telephone network/data network: online information retrieval started over the Plain Old Telephone Service (POTS), but switched from 1971 onwards to data networks with packet switching.
3. Modem (modulator and demodulator): a device (de)modulating zeros and ones to sounds in the POTS network and vice versa, transferring bits and bytes from A to B.
4. Terminal: a terminal had several forms, as no PCs existed yet. A terminal could be a keyboard with either a built-in or a separate printer. It could also be a screen and a keyboard, but without any intelligence; what came on the screen could be sent on to the printer. And there was a portable terminal consisting of a keyboard and a thermal printer.
5. Protocol: the communication protocol between the central computer and the terminal consisted of ASCII (American Standard Code for Information Interchange), a set of 128 agreed letters, figures and symbols. It was not a presentation protocol telling how a screen should be made up, but purely intended for texts. On monochrome terminal screens, which were grey, green or amber, no made-up pages were shown, but lines of 80 columns which rolled from top to bottom.


Blog Posting Number: 1215

Tags: