Chapter 2

Another factor is the development of electronic commerce. "Although a multilingual web may be desirable on moral and ethical grounds, such high ideals are not enough to make it other than a reality on a small scale. As well as the appropriate technology being available so that the non-English speaker can go [online], there is the impact of 'electronic commerce' as a major force that may make multilingualism the most natural path for cyberspace. Sellers of products and services in the virtual global marketplace into which the internet is developing must be prepared to deal with a virtual world that is just as multilingual as the physical world. If they want to be successful, they had better make sure they are speaking the languages of their customers!"

Bill Dunlap, founder of Euro-Marketing Associates and its virtual branch Global Reach, championed the benefits of e-commerce in Europe to his compatriots in the U.S., promoting the internationalization and localization of their websites. He wrote in December 1998: "There are so few people in the U.S. interested in communicating in many languages — most Americans are still under the delusion that the rest of the world speaks English. However, in Europe, the countries are small enough so that an international perspective has been necessary for centuries."

Peter Raggett, deputy-head (and then head) of the Central Library of OECD (Organization for Economic Cooperation and Development), wrote in August 1999: "I think it is incumbent on European organizations and businesses to try and offer websites in three or four languages if resources permit. In this age of globalization and electronic commerce, businesses are finding that they are doing business across many countries. Allowing French, German, Japanese speakers to easily read one's website as well as English speakers will give a business a competitive edge in the domain of electronic trading."

As the internet quickly spread worldwide, companies needed to offer bilingual, trilingual, even plurilingual websites to reach as large an audience as possible, while adapting their content to a given audience, either a country or a linguistic community. Hence the need to internationalize and localize websites, which became a major trend in the late 1990s and early 2000s, with English-language companies and organizations setting up plurilingual websites, in English and other languages, and non-English-language companies and organizations setting up websites in their own language(s) and English.

1999 > BILINGUAL DICTIONARIES IN WORDREFERENCE.COM

[Summary] Michael Kellogg created WordReference.com in 1999. He wrote much later on his website: "I started this site in 1999 in an effort to provide free online bilingual dictionaries and tools to the world. The site has grown gradually ever since to become one of the most-used online dictionaries, and the top online dictionary for its language pairs of English-Spanish, English-French, English-Italian, Spanish-French, and Spanish-Portuguese. It is consistently ranked in the top 500 most-visited websites in the world. I am proud of my history of innovation with dictionaries on the internet. Many of the features such as being able to click any word in a dictionary entry were first implemented by me.” WordReference also provided high-quality language forums, and lighter versions of some dictionaries for mobile devices.

***

Michael Kellogg created WordReference.com in 1999 to offer free online bilingual translation dictionaries.

Much later, Michael wrote on his website: "I started this site in 1999 in an effort to provide free online bilingual dictionaries and tools to the world. The site has grown gradually ever since to become one of the most-used online dictionaries, and the top online dictionary for its language pairs of English-Spanish, English-French, English-Italian, Spanish-French, and Spanish-Portuguese. It is consistently ranked in the top 500 most-visited websites in the world. I am proud of my history of innovation with dictionaries on the internet. Many of the features such as being able to click any word in a dictionary entry were first implemented by me.”

What was the idea behind his project? “The internet has done an incredible job of bringing the world together in the last few years. Of course, one of the greatest barriers has been language. Much of the content is in English and many, many users are reading English-language webpages as a second language. I know from my own experiences with Spanish-language websites that many readers probably understand much of what they are reading, but not every single word”, hence the need for a website offering free online bilingual translation dictionaries.

By 2010, WordReference also offered a monolingual dictionary in English as well as dictionaries from English to other languages (Arabic, Chinese, Czech, Greek, Japanese, Korean, Polish, Portuguese, Romanian, Turkish), and vice versa. For the Spanish language, there was a monolingual dictionary, a dictionary of synonyms, a Spanish-French dictionary and a Spanish-Portuguese dictionary. Conjugation tables were available for French, Italian and Spanish. Monolingual dictionaries were available for German and Russian.

WordReference Mini was a miniature version of the site to be embedded into other sites, for example sites teaching languages online. A mobile device version was available for dictionaries from English to French, Italian and Spanish, and vice versa, with other language pairs to come.

As stated by Michael Kellogg: “Today [in 2011], I have three main goals with this website. First, continue to create free online bilingual dictionaries for English to many other languages. I strive to offer translations for "all" English words, terms, idioms, sayings, etc. Second, provide the world's best language forums; and third, continue to innovate to produce the best website and tools for the world.”

1999 > THE INTERNET, A MANDATORY TOOL FOR TRANSLATORS

[Summary] The internet became a mandatory tool for translators as “a vital and endless source of information”, as stated by Marcel Grangier, the head of the French Section of Central Linguistic Services, meaning that he was in charge of organizing translations into French for the linguistic services of the Swiss government. He explained in January 1999: “To work without the internet is simply impossible now. Apart from all the tools used (email, the electronic press, services for translators), the internet is for us a vital and endless source of information in what I'd call the 'non-structured sector' of the web. For example, when the answer to a translation problem can't be found on websites presenting information in an organized way, in most cases search engines allow us to find the missing link somewhere on the network.” His services also offered an online directory called “Dictionnaires Électroniques” (Electronic Dictionaries) with links to most quality dictionaries available for free on the web.

***

The internet became a mandatory tool for translators as “a vital and endless source of information”, as stated by Marcel Grangier, the head of the French Section of Central Linguistic Services.

Marcel Grangier was in charge of organizing translations into French for the linguistic services of the Swiss government. He explained in January 1999: “To work without the internet is simply impossible now. Apart from all the tools used (email, the electronic press, services for translators), the internet is for us a vital and endless source of information in what I'd call the 'non-structured sector' of the web. For example, when the answer to a translation problem can't be found on websites presenting information in an organized way, in most cases search engines allow us to find the missing link somewhere on the network.

Our website was first conceived as an intranet service for translators in Switzerland, who often deal with the same kind of material as the Federal government's translators. Some parts of it are useful to any translators, wherever they are. The section "Dictionnaires Électroniques" [Electronic Dictionaries] is only one section of the website. Other sections deal with administration, law, the French language, and general information. The site also hosts the pages of the Conference of Translation Services of European States (COTSOES).”

"Dictionnaires Électroniques" was an extensive directory of free dictionaries available online, with five main sections: abbreviations and acronyms, monolingual dictionaries, bilingual dictionaries, multilingual dictionaries, and geographical information. The index could also be searched by keyword. It was later transferred to the new website of COTSOES.

According to Marcel Grangier, “we can see multilingualism on the internet as a happy and irreversible inevitability. So we have to laugh at the doomsayers who only complain about the supremacy of English. Such supremacy is not wrong in itself, because it is mainly based on statistics (more PCs per inhabitant, more people speaking English, etc.). The answer is not to 'fight' English, much less whine about it, but to build more sites in other languages. As a translation service, we also recommend that websites be multilingual. (…) The increasing number of languages on the internet is inevitable and can only boost multicultural exchanges. For this to happen in the best possible circumstances, we still need to develop tools to improve compatibility. Fully coping with accents and other characters is only one example of what can be done."

Maria Victoria Marinetti was a translator from French to Spanish living near Geneva, Switzerland, with a doctorate in engineering from Mexico. She wrote in August 1999: “I have access to a large amount of global information, which is very interesting for me. I can also regularly send and receive files. The internet allows me to receive or send general and technical translations from French into Spanish, and vice versa, and to correct texts in Spanish. In the technical or chemical fields, I offer technical assistance, as well as information about exporting high-tech equipment to Mexico or to other Latin American countries.”

As for multilingualism, "it is very important to be able to communicate in various languages. I would even say this is mandatory, because the information given on the internet is meant for the whole world, so why wouldn't we get this information in our language or in the language we wish? Worldwide information, but no broad choice for languages, this would be quite a contradiction, wouldn't it?"

In 2000, the internet was multilingual, with half of its users having a mother tongue other than English, but the language barrier was far from gone. While almost any language was now present on the web, many users were monolingual, and even plurilingual users couldn’t read all languages. Bridges were needed between language communities to improve the flow of information in other languages, including better translation software and tools for all languages, not only the dominant ones.

1999 > THE NEED FOR BILINGUAL INFORMATION ONLINE

[Summary] With the web spreading worldwide, bilingual information online became mandatory, as stated by Henk Slettenhaar, a professor in communication technologies at Webster University, Geneva, Switzerland, and a trilingual European. Henk spent his childhood in Holland, taught his courses in English, and lived in neighboring France. He wrote in August 1999: "There are two main categories of websites in my opinion. The first one is the global outreach for business and information. Here the language is definitely English first, with local versions where appropriate. The second one is local information of all kinds in the most remote places. If the information is meant for people of an ethnic and/or language group, it should be in that language first, with perhaps a summary in English. We have seen lately how important these local websites are — in Kosovo and Turkey, to mention just the most recent ones. People were able to get information about their relatives through these sites."

***

With the web spreading worldwide, bilingual information online became mandatory, as stated by Henk Slettenhaar, a professor in communication technologies at Webster University, Geneva, Switzerland, and a trilingual European.

Henk spent his childhood in Holland, taught his courses in English, and lived in neighboring France. He wrote in December 1998: "I see multilingualism as a very important issue. Local communities that are on the web should principally use the local language for their information. If they want to present it to the world community as well, it should be in English too. I see a real need for bilingual websites. I am delighted there are so many offerings in the original language now. I much prefer to read the original with difficulty than getting a bad translation."

Henk added in August 1999: "There are two main categories of websites in my opinion. The first one is the global outreach for business and information. Here the language is definitely English first, with local versions where appropriate. The second one is local information of all kinds in the most remote places. If the information is meant for people of an ethnic and/or language group, it should be in that language first, with perhaps a summary in English. We have seen lately how important these local websites are — in Kosovo and Turkey, to mention just the most recent ones. People were able to get information about their relatives through these sites."

Geoffrey Kingscott, managing director of Praetorius, a language consultancy in applied languages, wrote in September 1998: "Because the salient characteristics of the web are the multiplicity of site generators and the cheapness of message generation, as the web matures it will in fact promote multilingualism. The fact that the web originated in the USA means that it is still predominantly in English but this is only a temporary phenomenon. If I may explain this further, when we relied on the print and audiovisual (film, television, radio, video, cassettes) media, we had to depend on the information or entertainment we wanted to receive being brought to us by agents (publishers, television and radio stations, cassette and video producers) who have to subsist in a commercial world or — as in the case of public service broadcasting — under severe budgetary restraints. That means that the size of the customer-base is all-important, and determines the degree to which languages other than the ubiquitous English can be accommodated. These constraints disappear with the web. To give only a minor example from our own experience, we publish the print version of Language Today [a magazine for linguists] only in English, the common denominator of our readers. When we use an article which was originally in a language other than English, or report an interview which was conducted in a language other than English, we translate it into English and publish only the English version. This is because the number of pages we can print is constrained, governed by our customer-base (advertisers and subscribers). But for our web edition we also give the original version."

Steven Krauwer, coordinator of ELSNET (European Network of Excellence in Human Language Technologies), explained in September 1998: "As a European citizen I think that multilingualism on the web is absolutely essential, as in the long run I don't think that it is a healthy situation when only those who have a reasonable command of English can fully exploit the benefits of the web. As a researcher (specialized in machine translation) I see multilingualism as a major challenge: how can we ensure that all information on the web is accessible to everybody, irrespective of language differences."

What practical solutions would he suggest? He answered in August 1999: "At the author end: better education of web authors to use combinations of modalities to make communication more effective across language barriers (and not just for cosmetic reasons). At the server end: more translation facilities à la AltaVista (quality not impressive, but always better than nothing). At the browser end: more integrated translation facilities (especially for the smaller languages), and more quick integrated dictionary lookup facilities."

Bruno Didier, webmaster of the Pasteur Institute’s library, wrote in August 1999: “The internet doesn't belong to any one nation or language. It is a vehicle for culture, and the first vector of culture is language. The more languages there are on the net, the more cultures will be represented there. I don't think we should give in to the kneejerk temptation to translate webpages into a largely universal language. Cultural exchanges will only be real if we are prepared to meet with the other culture in a genuine way. And this effort involves understanding the other culture's language. This is very idealistic of course. In practice, when I am monitoring, I curse Norwegian or Brazilian websites where there isn't any English.”

Alain Bron, a consultant in information systems and a writer, explained in January 1999: "Different languages will still be used for a long time to come and this is healthy for the right to be different. The risk is of course an invasion of one language to the detriment of others, and with it the risk of cultural standardization. I think online services will gradually emerge to get around this problem. First, translators will be able to translate and comment on texts by request, but mainly sites with a large audience will provide different language versions, just as the audiovisual industry does now."

In spring 2000, non-English speakers reached 50% of all internet users, but 78% of webpages were still in English in September 2000.

2000 > ONLINE ENCYCLOPEDIAS AND DICTIONARIES

[Summary] The first reference encyclopedias and dictionaries available online stemmed from print versions. Britannica.com was available in December 1999 as the web version of the 32-volume Encyclopaedia Britannica, first for free and then for a fee. The French-language WebEncyclo from Editions Atlas was available at the same time, for free, as well as the Encyclopaedia Universalis, for a fee. The first major online dictionaries also stemmed from print versions, for example the free Merriam-Webster Online launched in 1996, which included the Webster Dictionary, the Webster Thesaurus, and other tools. The French-language “Dictionnaire Universel Francophone en Ligne” from Hachette was available for free in 1997. The online version of the 20-volume Oxford English Dictionary (OED) was available in March 2000 for a fee. Designed directly for the web, the Grand Dictionnaire Terminologique (GDT) was launched in September 2000 in Quebec as the largest free French-English terminology dictionary, and was quickly praised by linguists worldwide.

***

The first reference encyclopedias and dictionaries available online stemmed from print versions.

# Encyclopedias

Britannica.com was launched in December 1999 as the digital equivalent of the 32 volumes of the 15th edition of the Encyclopaedia Britannica. The website was available for free, as a complement to the print and CD-ROM versions for sale, with a selection of articles from 70 magazines, a guide to the best websites, a selection of books, etc., all searchable through a single search engine. In September 2000, the site was among the top 100 websites in the world. In July 2001, the website was no longer free, and could be searched for a monthly or annual fee. In 2009, Britannica.com opened its website to external contributors, with registration required to write and edit articles.

Launched by Editions Atlas in December 1999 and stemming from a print encyclopedia, Webencyclo was the first main French-language online encyclopedia available for free. It was searchable by keyword, topic and media (i.e. maps, links, photos, illustrations). A call for papers invited specialists in a given topic to become external contributors and submit articles in a section called "Webencyclo Contributif". Later on, a free registration was required to use the online encyclopedia.

Launched at the same time, the website of the print French-language Encyclopaedia Universalis included 28,000 articles by 4,000 contributors, available for an annual subscription fee, with a number of articles available for free.

# Dictionaries

Merriam-Webster, a well-known publisher of dictionaries, launched in 1996 the website "Merriam-Webster Online: The Language Center" to give free access to online resources stemming from several print reference works: Webster Dictionary, Webster Thesaurus, Webster's Third (a lexical landmark), Guide to International Business Communications, Vocabulary Builder (with interactive vocabulary quizzes), and the Barnhart Dictionary Companion (hot new words). The website’s goal was also to help track down definitions, spellings, pronunciations, synonyms, vocabulary exercises, and other key facts about words and language.

The "Dictionnaire Universel Francophone en Ligne" (Universal French-Language Online Dictionary) was the web version of the "Dictionnaire Universel Francophone", published by Hachette in partnership with AUPELF-UREF (which later became AUF: Agence Universitaire de la Francophonie - University Agency of Francophony). The dictionary included not only standard French but also the French-language words and expressions used worldwide. French was spoken by 500 million people in 50 countries. As a side remark, English and French are the only official and/or cultural languages widely spread on five continents.

The online version (for a subscription fee) of the 20-volume Oxford English Dictionary (OED) was launched in March 2000 by Oxford University Press (OUP), followed by a quarterly update with around 1,000 new or revised entries. Two years later, Oxford University Press launched Oxford Reference Online (ORO), a comprehensive encyclopedia designed directly for the web and also available for a subscription fee. Its 60,000 webpages and one million entries could represent the equivalent of 100 print encyclopedias.

# The GDT from Quebec

With 3 million terms related to industry, science and commerce, the GDT (Grand Dictionnaire Terminologique - Main Terminological Dictionary) was the largest French-English online terminology dictionary. The GDT was designed directly for the web by OQLF (Office Québécois de la Langue Française - Quebecois Office of the French Language) and launched in September 2000 as a free service. The GDT was a technological challenge, and the result of a partnership between OQLF, author of the dictionary, and Semantix, a company specialized in linguistic software. The GDT had 1.3 million individual visits during the first month, with peaks of 60,000 visits per day, which certainly contributed to better translations. The database was then maintained by Convera Canada, with 3.5 million visits per month in February 2003. A revamped version of the GDT went online in March 2003, with the database maintained by OQLF itself, and the addition of Latin as a third language.

2000 > THE WEB PORTAL YOURDICTIONARY.COM

[Summary] Robert Beard, a language teacher at Bucknell University, in Lewisburg, Pennsylvania (USA), co-founded yourDictionary.com in February 2000 as a follow-up of his first website, A Web of Online Dictionaries (included in the new one), launched in 1995 as a directory of online dictionaries (with 800 links in fall 1998) and other linguistic resources such as thesauri, vocabularies, glossaries, grammars and language textbooks. yourDictionary.com included 1,800 dictionaries in 250 languages in September 2003, and 2,500 dictionaries in 300 languages in April 2007. As a portal for all languages without any exception, the site also offered a section for endangered languages, called the Endangered Language Repository.

***

Robert Beard created the website A Web of Online Dictionaries (WOD) in 1995, five years before co-founding yourDictionary.com, the portal for all languages without exception, in February 2000.

Robert Beard was a language teacher at Bucknell University, in Lewisburg, Pennsylvania (USA). In September 1998, his website provided an index of 800 online dictionaries in 150 languages, as well as sections for multilingual dictionaries, specialized English dictionaries, thesauri and other vocabulary aids, language identifiers and guessers, an index of dictionary indices, the “Web of Online Grammars”, and the “Web of Linguistic Fun”, i.e. linguistics for non-specialists.

Robert Beard wrote in September 1998: "There was an initial fear that the web posed a threat to multilingualism on the web, since HTML and other programming languages are based on English and since there are simply more websites in English than any other language. However, my websites indicate that multilingualism is very much alive and the web may, in fact, serve as a vehicle for preserving many endangered languages. I now have links to dictionaries in 150 languages and grammars of 65 languages. Moreover, the new attention paid by browser developers to the different languages of the world will encourage even more websites in different languages."

Fifteen months later, Robert Beard folded his website into a larger project, yourDictionary.com, which he co-founded in early 2000.

He wrote in January 2000: "The new website is an index of 1,200+ dictionaries in more than 200 languages. Besides the WOD, the new website includes a word-of-the-day-feature, word games, a language chat room, the old 'Web of Online Grammars' (now expanded to include additional language resources), the 'Web of Linguistic Fun', multilingual dictionaries; specialized English dictionaries; thesauri and other vocabulary aids; language identifiers and guessers, and other features; dictionary indices. yourDictionary.com will hopefully be the premiere language portal and the largest language resource site on the web. It is now actively acquiring dictionaries and grammars of all languages with a particular focus on endangered languages. It is overseen by a blue ribbon panel of linguistic experts from all over the world. (…)

Indeed, yourDictionary.com has lots of new ideas. We plan to work with the Endangered Language Fund in the U.S. and Britain to raise money for the Foundation's work and publish the results on our site. We will have language chat rooms and bulletin boards. There will be language games designed to entertain and teach fundamentals of linguistics. The Linguistic Fun page will become an online journal for short, interesting, yes, even entertaining, pieces on language that are based on sound linguistics by experts from all over the world."

As the portal for all languages without any exception, yourDictionary.com offered a section for endangered languages called the Endangered Language Repository.

As explained by Robert Beard: "Languages that are endangered are primarily languages without writing systems at all (only 1/3 of the world's 6,000+ languages have writing systems). I still do not see the web contributing to the loss of language identity and still suspect it may, in the long run, contribute to strengthening it. More and more Native Americans, for example, are contacting linguists, asking them to write grammars of their language and help them put up dictionaries. For these people, the web is an affordable boon for cultural expression."

How about the future of the web? "The web will be an encyclopedia of the world by the world for the world. There will be no information or knowledge that anyone needs that will not be available. The major hindrance to international and interpersonal understanding, personal and institutional enhancement, will be removed. It would take a wilder imagination than mine to predict the effect of this development on the nature of humankind.”

2000 > PROJECT GUTENBERG AND LANGUAGES

[Summary] Project Gutenberg is a visionary project launched by Michael Hart in July 1971 to create free electronic versions of literary works and disseminate them worldwide. In 2010, Project Gutenberg offered more than 33,000 high-quality ebooks being downloaded by the tens of thousands every day, and websites in the United States, Australia, Europe and Canada, with 40 mirror sites worldwide. Project Gutenberg mainly offers ebooks in English, but multilingualism has been one of its priorities since the late 1990s. French is the second language of the project. There were ebooks in 60 languages in December 2010, thanks to the patient work of Distributed Proofreaders, a website created in 2000 to share the proofreading of ebooks between hundreds of volunteers in many countries.

***

Project Gutenberg is a visionary project launched in July 1971 by Michael Hart to create free electronic versions of literary works and disseminate them worldwide. In the 15th century, Gutenberg allowed anyone to have print books at a small cost. In the 21st century, Project Gutenberg would allow anyone to have a digital library at no cost.

Michael worked from Illinois, typing in books from the public domain, for example the Bible and the complete works of Shakespeare, first alone, then with the help of a few volunteers.

His project got a major boost with the invention of the web in 1990. 95% of internet users were native English speakers in the mid-1990s, so most books were in English.

Project Gutenberg also inspired other digital libraries in Europe. Projekt Runeberg was launched in Sweden in 1994 to digitize Nordic (Scandinavian) literature from the public domain. Projekt Gutenberg-DE was launched in Germany in 1994 to digitize German literature from the public domain.

French was the second language of Project Gutenberg, and still is now. The first ebooks released in French were six works by Stendhal and two works by Jules Verne, all released in early 1997. Three novels by Jules Verne were already available in English in 1994. Since then, Jules Verne has always remained among the most downloaded authors.

In October 1997, Michael Hart wrote about producing more works in languages other than English in the Project Gutenberg newsletter. In early 1998, in addition to ten French ebooks, there were a few ebooks in German, Italian, Spanish and Latin. Released in May 1999, eBook #2000 was “Don Quijote” (1605), by Cervantes, in Spanish, its original language. In July 1999, Michael wrote in an email interview: "I am publishing in one new language per month right now, and will continue as long as possible."

The project got a new boost with the launching of Distributed Proofreaders, a website created in October 2000 by Charles Franks to share the proofreading of ebooks between hundreds of volunteers living in many countries.

Released in April 2002, eBook #5000 was “The Notebooks of Leonardo da Vinci” (written in the early 16th century), as an English translation from Italian, its original language. Since its release, it has regularly stayed in the top 100 downloaded ebooks.

There were works in 25 languages in early 2004, in 42 languages in July 2005, including Sanskrit and the Mayan languages, and in 59 languages in October 2010. The ten main languages were English (with 28,441 ebooks on 7 October 2010), French (1,659 ebooks), German (709 ebooks), Finnish (536 ebooks), Dutch (496 ebooks), Portuguese (473 ebooks), Chinese (405 ebooks), Spanish (295 ebooks), Italian (250 ebooks), and Greek (101 ebooks). The next languages were Latin, Esperanto, Swedish and Tagalog.

When machine translation is judged 99% satisfactory, we may be able to read literary classics in a choice of many languages. Machine-translated ebooks won't compete with the work of literary translators and their labor of love over days and months, if not years, but they will allow readers to get the gist of literary works that have never been translated so far, or have only been translated into a few languages for commercial reasons.

The output of translation software could then be proofread by human translators, much as the output of OCR software is proofread by the volunteers of Distributed Proofreaders. So maybe we will see the creation of Distributed Translators one day, as a partner or sister project of Distributed Proofreaders and Project Gutenberg.

2001 > WIKIPEDIA, A COLLABORATIVE ENCYCLOPEDIA

[Summary] Wikipedia was launched in January 2001 by Jimmy Wales and Larry Sanger (Larry resigned later on) as a global free collaborative online encyclopedia, financed by donations, with no advertising. Its website is a wiki, which means that anyone can write, edit, correct and improve information throughout the encyclopedia, with people contributing under a pseudonym. The articles stay the property of their authors, and can be freely used according to Creative Commons or GFDL (GNU Free Documentation License). Wikipedia quickly became the largest reference website. It was in the top ten websites in December 2006, and in the top five websites in 2008. In May 2007, Wikipedia had 7 million articles in 192 languages, including 1.8 million articles in English, 589,000 articles in German, 500,000 articles in French, 260,000 articles in Portuguese, and 236,000 articles in Spanish. Wikipedia celebrated its tenth anniversary in January 2011 with 17 million articles in 270 languages and 400 million individual visits per month for all websites.

***

Wikipedia was launched in January 2001 by Jimmy Wales and Larry Sanger (Larry resigned later on) as a global free collaborative online encyclopedia.

Wikipedia was financed by donations, with no advertising. Its website is a wiki, which means that anyone can write, edit, correct and improve information throughout the encyclopedia, with people contributing under a pseudonym. The articles stay the property of their authors, and can be freely used according to Creative Commons or GFDL (GNU Free Documentation License).

Wikipedia is hosted by the Wikimedia Foundation, founded in June 2003, which has run a number of other projects, beginning with Wiktionary (launched in December 2002) and Wikibooks (launched in June 2003), followed by Wikiquote, Wikisource (texts from public domain), Wikimedia Commons (multimedia), Wikispecies (animals and plants), Wikinews and Wikiversity (textbooks).

Wikipedia quickly became the largest reference website, with thousands of people contributing worldwide. In December 2004, Wikipedia had 1.3 million articles by 13,000 contributors in 100 languages. In December 2006, Wikipedia was among the top ten sites on the web, with 6 million articles. In May 2007, Wikipedia had 7 million articles in 192 languages, including 1.8 million articles in English, 589,000 articles in German, 500,000 articles in French, 260,000 articles in Portuguese, and 236,000 articles in Spanish. In 2008, Wikipedia was in the top five websites. In September 2010, Wikipedia had 14 million articles in 272 languages, including 3.4 million articles in English, 1.1 million articles in German and 1 million articles in French. Wikipedia celebrated its tenth anniversary in January 2011 with 17 million articles in 270 languages and 400 million individual visits per month for all websites.

Wikipedia also inspired many other projects over the years, for example Citizendium, launched in 2007 as a pilot project to build a new encyclopedia.

Citizendium, short for “The Citizen’s Compendium”, was launched in March 2007 at the initiative of Larry Sanger, who co-founded Wikipedia with Jimmy Wales in January 2001, but resigned later on over policy and content quality issues, as well as the use of anonymous pseudonyms.

Citizendium is a wiki project open to public collaboration, but combining "public participation with gentle expert guidance". The project is experts-led, not experts-only. Contributors use their own names, and they are guided by expert editors. As explained by Larry in his essay "Toward a New Compendium of Knowledge", posted in September 2006 and updated in March 2007: "Editors will be able to make content decisions in their areas of specialization, but otherwise working shoulder-to-shoulder with ordinary authors." There are also constables who make sure the rules are respected.

There were 1,100 high-quality articles, 820 authors, and 180 editors in March 2007, 11,800 articles in August 2009, and 15,000 articles in September 2010. Citizendium wants to act as a prototype for upcoming large-scale knowledge-building projects that would deliver reliable reference, scholarly and educational content.

2001 > UNL, A DIGITAL METALANGUAGE PROJECT

[Summary] The UNDL Foundation (UNDL: Universal Networking Digital Language) was founded in January 2001 to develop and promote the UNL (Universal Networking Language) project. The UNL project was launched in 1996 as a major digital metalanguage project by the Institute of Advanced Studies (IAS) of the United Nations University (UNU) in Tokyo, Japan. As explained in 1998 on the bilingual English-Japanese website: "UNL is a language that — with its companion 'enconverter' and 'deconverter' software — enables communication among peoples of differing native languages. It will reside, as a plug-in for popular web browsers, on the internet, and will be compatible with standard network servers." At the time, 120 researchers worldwide were working on a multilingual project in 16 languages (Arabic, Brazilian, Chinese, English, French, German, Hindi, Indonesian, Italian, Japanese, Latvian, Mongolian, Russian, Spanish, Swahili, Thai).

***

The UNDL Foundation (UNDL: Universal Networking Digital Language) was founded in January 2001 to develop and promote the UNL (Universal Networking Language) project.

The UNL project was launched in 1996 as a major digital metalanguage project by the Institute of Advanced Studies (IAS) of the United Nations University (UNU) in Tokyo, Japan.

As explained in 1998 on the bilingual English-Japanese website: "UNL is a language that — with its companion 'enconverter' and 'deconverter' software — enables communication among peoples of differing native languages. It will reside, as a plug-in for popular web browsers, on the internet, and will be compatible with standard network servers. The technology will be shared among the member states of the United Nations. Any person with access to the internet will be able to 'enconvert' text from any native language of a member state into UNL. Just as easily, any UNL text can be 'deconverted' from UNL into native languages. United Nations University's UNL Center will work with its partners to create and promote the UNL software, which will be compatible with popular network servers and computing platforms."

At the time, 120 researchers worldwide were working on a multilingual project in 16 languages (Arabic, Brazilian, Chinese, English, French, German, Hindi, Indonesian, Italian, Japanese, Latvian, Mongolian, Russian, Spanish, Swahili, Thai). Once the project was working in 16 languages, other United Nations languages would be added from 2000 onwards.

UNL was meant to become the HTML of linguistic content. Possible applications would be multilingual email, multilingual information, active dictionaries for reading foreign languages online, and machine translation for navigating the web and monitoring websites.

The project was also important from a political and cultural point of view, as the first project building up tools for all languages on the internet, i.e. main languages as well as minority languages.

The UNDL Foundation was founded in January 2001 to develop and promote the UNL project, and became a partner of the United Nations.

The definition of UNL has evolved over the years. According to the UNDL Foundation’s website in 2010: “UNL is a computer language that enables computers to process information and knowledge. It is designed to replicate the functions of natural languages. Using UNL, people can describe all information and knowledge conveyed by natural languages for computers. As a result, computers can intercommunicate through UNL and process information and knowledge using UNL, thus providing people with a Linguistic Infrastructure (LI) in computers and the internet for distributing, receiving and understanding multilingual information. Such multilingual information can be accessed by natural languages through the UNL System. UNL, as a language for expressing information and knowledge described in natural languages, has all the components corresponding to that of a natural language.”

2001 > A MARKET FOR LANGUAGE TRANSLATION SOFTWARE

[Summary] The development of electronic commerce boosted language translation software, products and services targeting the general public, language professionals, and companies localizing their websites. The software, products and services were developed for example by Alis Technologies, Globalink, Lernout & Hauspie, Softissimo and IBM. In March 2001, IBM embarked on a growing translation market with a high-end professional product, the WebSphere Translation Server. The software could instantly translate webpages, emails and chats from/into several languages (Chinese, English, French, German, Italian, Japanese, Korean, Spanish). It could process 500 words per second and add terminology to the software. Computer-assisted translation (CAT) software were developed for professional translators, based on “translation memory” with terminology processing in real time, for example Wordfast, created in 1999 by Yves Champollion. Worldfast could be used on any platform (Windows, Mac, Linux), and was compatible with the software of other key players like IBM and SDL Trados.

***

The development of electronic commerce boosted language translation software, products and services targeting the general public, language professionals, and companies localizing their websites.

The software, products and services were developed for example by Alis Technologies, Globalink, Lernout & Hauspie, Softissimo and IBM.

In March 2001, IBM embarked on a growing translation market with a high-end professional product, the WebSphere Translation Server. The software could instantly translate webpages, emails and chats from/into several languages (Chinese, English, French, German, Italian, Japanese, Korean, Spanish). It could process 500 words per second, and users could add their own terminology to the software.

Machine translation (MT) can be defined as the automated process of translating a text from one language to another. MT analyzes the text in the source language and automatically generates the corresponding text in the target language. Because there is no human intervention during the translation process, machine translation differs from computer-assisted translation (CAT), which is based on interaction between the translator and the computer.

Computer-assisted translation (CAT) software was developed for professional translators, based on “translation memory” with terminology processing in real time, for example Wordfast, created in 1999 by Yves Champollion. Wordfast was compatible with the software of other key players like IBM and SDL Trados. Available for any platform (Windows, Mac, Linux), Wordfast had 14,000 customers worldwide in 2010, including the United Nations, Coca-Cola and Sony.
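The translation-memory idea behind tools like Wordfast can be sketched in a few lines: previously translated segment pairs are stored, and each new source segment is checked against them for an exact or fuzzy match that the human translator can then accept or edit. The segments, threshold and `lookup` helper below are invented for illustration; they are a minimal sketch of the concept, not Wordfast's actual format or behavior:

```python
from difflib import SequenceMatcher

# A toy translation memory: source segments mapped to stored translations.
# These entries are invented examples; real CAT tools store memories in
# standard formats such as TMX.
memory = {
    "The file could not be opened.": "Le fichier n'a pas pu etre ouvert.",
    "Save your changes before closing.": "Enregistrez vos modifications avant de fermer.",
}

def lookup(segment, threshold=0.75):
    """Return (stored translation, similarity score) for the best fuzzy
    match above the threshold, or None if nothing is close enough."""
    best_score, best_target = 0.0, None
    for source, target in memory.items():
        score = SequenceMatcher(None, segment, source).ratio()
        if score > best_score:
            best_score, best_target = score, target
    if best_target is not None and best_score >= threshold:
        return best_target, best_score
    return None
```

An exact repeat of a stored segment scores 1.0; a slightly reworded segment may still score above the threshold and be offered as a fuzzy match, which is where the translator's interactive editing comes in.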

According to Tim McKenna, a writer and philosopher interviewed in October 2000: "When software gets good enough for people to chat or talk on the web in real time in different languages, then we will see a whole new world appear before us. Scientists, political activists, businesses and many more groups will be able to communicate immediately without having to go through mediators or translators."

A further step could be “transcultural, transnational transparency”, as stated in September 1998 by Randy Hobler, a consultant in internet marketing of translation software and services: "We are rapidly reaching the point where highly accurate machine translation of text and speech will be so common as to be embedded in computer platforms, and even in chips in various ways. At that point, and as the growth of the web slows, the accuracy of language translation hits 98% plus, and the saturation of language pairs has covered the vast majority of the market, language transparency (any-language-to-any-language communication) will be too limiting a vision for those selling this technology. The next development will be 'transcultural, transnational transparency', in which other aspects of human communication, commerce and transactions beyond language alone will come into play. For example, gesture has meaning, facial movement has meaning and this varies among societies. (…)

There are thousands of ways in which cultures and countries differ, and most of these are computerizable to change as one goes from one culture to the other. They include laws, customs, business practices, ethics, currency conversions, clothing size differences, metric versus English system differences, etc. Enterprising companies will be capturing and programming these differences and selling products and services to help the peoples of the world communicate better. Once this kind of thing is widespread, it will truly contribute to international understanding."

2004 > THE WEB 2.0, COMMUNITY AND SHARING

[Summary] The term "web 2.0" was invented in 2004 by Tim O'Reilly, founder of O'Reilly Media, a publisher of computer books, as a title for a series of conferences he was organizing. The web 2.0 was based on community and sharing, with a wealth of websites whose content was supplied by users, such as blogs, wikis, social networks and collaborative encyclopedias. Wikipedia, Facebook and Twitter, of course, but also tens of thousands of others. The web 2.0 may begin to fulfill the dream of Tim Berners-Lee, who invented the web in 1990, and wrote in an essay dated April 1998: "The dream behind the web is of a common information space in which we communicate by sharing information. Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished." ("The World Wide Web: A very short personal history", available on his webpage on the W3C website)

***

The term "web 2.0" was invented in 2004 by Tim O'Reilly, founder of O'Reilly Media, a publisher of computer books, as a title for a series of conferences he was organizing.

The web 2.0 was based on community and sharing, with a wealth of websites whose content was supplied by users, such as blogs, wikis, social networks and collaborative encyclopedias. Wikipedia, Facebook and Twitter, of course, but also tens of thousands of others.

The web 2.0 may begin to fulfill the dream of Tim Berners-Lee, who invented the web in 1990, and wrote in an essay dated April 1998: "The dream behind the web is of a common information space in which we communicate by sharing information. Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished.” ("The World Wide Web: A very short personal history", available on his webpage on the W3C website)

The first blog was launched in 1997. A blog is an online diary kept by a person or a group, usually in reverse chronological order, and can be updated every minute or once a month. There were 14 million blogs worldwide in July 2005, with 80,000 new blogs per day. According to Technorati, the first blog search engine, there were 65 million blogs in December 2006, with 175,000 new blogs per day. Some blogs are devoted to photos (photoblogs), music (audioblogs or podcasts), and videos (vlogs or videoblogs).

The wiki concept became quite popular in 2000. Deriving from the Hawaiian term "wiki" ("fast"), a wiki is a website allowing multiple users to collaborate online on the same project. Users can contribute to drafting content, editing it, improving it, and updating it. The software can be simple or more elaborate. A simple program handles text and hyperlinks. With a more elaborate program, one can embed images, charts, tables, etc. The most famous wiki is Wikipedia.

Facebook was founded in February 2004 by Mark Zuckerberg and his fellow students as a social network. Originally created for the students of Harvard University, it was then available to students from any university in the U.S. before being open to anyone worldwide in September 2006, to connect with relatives, friends and strangers. Facebook was the second most visited website after Google, with 500 million users in June 2010, while sparking debates on privacy issues.

Founded in 2006 by Jack Dorsey and Biz Stone, Twitter is a social networking and micro-blogging tool to send free short messages of 140 characters maximum, called tweets, via the internet, IM or SMS. Sometimes described as the SMS of the internet, Twitter gained worldwide popularity, with 106 million users in April 2010, and 300,000 new users per day. As for tweets, there were 5,000 per day in 2007, 300,000 in 2008, 2.5 million in 2009, 50 million in January 2010, and 55 million in April 2010, with the archiving of public tweets by the Library of Congress as a reflection of the trends of our time.

We now try to fulfill the second part of Tim Berners-Lee’s dream, according to his essay dated April 1998: “There was a second part of the dream, too, dependent on the web being so generally used that it became a realistic mirror (or in fact the primary embodiment) of the ways in which we work and play and socialize. That was that once the state of our interactions was online, we could then use computers to help us analyze it, make sense of what we are doing, where we individually fit in, and how we can better work together."

2007 > THE ISO 639-3 STANDARD TO IDENTIFY LANGUAGES

[Summary] The first standard to identify languages was ISO 639-1, adopted by the International Organization for Standardization (ISO) in 1988 as a set of two-letter identifiers. The ISO 639-2 standard followed in 1998 as a set of three-letter codes identifying 400 languages. Published by SIL International, the Ethnologue, an encyclopedic catalog of living languages, had also developed its own three-letter codes in its database since 1971, with their inclusion in the publication itself since 1984 (10th edition). ISO 639-2 quickly became outdated. In 2002, at the invitation of the International Organization for Standardization, SIL International prepared a new standard that reconciled the complete set of identifiers used in the Ethnologue with the identifiers already in use in ISO 639-2, as well as identifiers developed by the Linguist List to handle ancient and constructed languages. Published in 2007, the ISO 639-3 standard provided three-letter codes for identifying 7,589 languages. SIL International was named the registration authority for the inventory of language identifiers.

***

Published in 2007, the ISO 639-3 standard provided three-letter codes for identifying 7,589 languages.

The first standard to identify languages was ISO 639-1, adopted by the International Organization for Standardization (ISO) in 1988 as a set of two-letter language identifiers.

The ISO 639-2 standard followed in 1998 as a set of three-letter codes identifying 400 languages. The standard was a convergence of ISO 639-1 and the ANSI Z39.53 standard (ANSI: American National Standards Institute). The ANSI standard corresponded to the MARC (Machine Readable Cataloging) language codes, a set of three-letter identifiers developed by the library community and adopted as an American National Standard in 1987.

Published by SIL International, the Ethnologue, an encyclopedic catalog of living languages, had also developed its own three-letter codes in its database since 1971, with the inclusion in the encyclopedia itself from the 10th edition (1984) onwards.

ISO 639-2 quickly became insufficient because of the small number of languages it could handle. In 2002, at the invitation of the International Organization for Standardization, SIL International prepared a new standard that reconciled the complete set of codes used in the Ethnologue with the codes already in use in ISO 639-2, as well as codes developed by the Linguist List — a main distribution list for linguists — to handle ancient and constructed languages.

Approved in 2006 and published in 2007, the ISO 639-3 standard provided three-letter codes for identifying 7,589 languages, with a list of languages as complete as possible, living and extinct, ancient and reconstructed, major and minor, and written and unwritten. SIL International was named the registration authority for the inventory of language identifiers, and administers the annual cycle for changes and updates.
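The relationship between the older two-letter identifiers of ISO 639-1 and the three-letter identifiers of ISO 639-3 can be pictured as a lookup table. The handful of entries below are real codes, but the table and the `to_iso639_3` helper are only an illustrative sketch of how software might normalize language tags, not an official implementation of the standard:

```python
# A tiny excerpt of the mapping from ISO 639-1 two-letter codes to
# ISO 639-3 three-letter identifiers. The full ISO 639-3 inventory
# covers thousands of languages; these few entries are illustrative.
ISO_639_1_TO_3 = {
    "en": "eng",  # English
    "fr": "fra",  # French
    "de": "deu",  # German
    "es": "spa",  # Spanish
    "fi": "fin",  # Finnish
}

def to_iso639_3(code):
    """Normalize a two- or three-letter code to an ISO 639-3 identifier,
    returning None for a two-letter code outside this small table."""
    code = code.lower()
    if len(code) == 3:
        return code
    return ISO_639_1_TO_3.get(code)
```

Because ISO 639-3 assigns an identifier to every known language while ISO 639-1 covers only a few hundred, a mapping in this direction always exists, but the reverse is only partial.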

2007 > GOOGLE TRANSLATE

[Summary] Launched by Google in October 2007, Google Translate is a free online language translation service that instantly translates a section of text, document or webpage into another language. Users paste texts in the web interface or supply a hyperlink. The automatic translations are produced by statistical analysis rather than traditional rule-based analysis. Prior to this date, Google used a Systran-based translator like Babel Fish in Yahoo! As an automatic translation tool, Google Translate can help the reader understand the general content of a foreign language text, but doesn’t deliver accurate translations. In 2009, the text could be read by a speech program, with new languages added over the months. Released in June 2009, Google Translator Toolkit is a web service allowing (human) translators to edit the translations automatically generated by Google Translate. In January 2011, people could choose different translations for a word in Google Translate.

***

Launched by Google in October 2007, Google Translate is a free online language translation service that instantly translates a section of text, document or webpage into another language.

Users paste texts in the web interface or supply a hyperlink. The automatic translations are produced by statistical analysis rather than traditional rule-based analysis.

As an automatic translation tool, Google Translate can help the reader understand the general content of a foreign language text, but doesn’t deliver accurate translations.

Prior to this date, Google used a Systran-based translator like Babel Fish in Yahoo!, with several stages for the language options:

First stage: English to French, German, and Spanish, and vice versa.
Second stage: English to Portuguese and Dutch, and vice versa.
Third stage: English to Italian, and vice versa.
Fourth stage: English to simplified Chinese, Japanese and Korean, and vice versa.
Fifth stage (April 2006): English to Arabic, and vice versa.
Sixth stage (December 2006): English to Russian, and vice versa.
Seventh stage (February 2007): English to traditional Chinese, and simplified Chinese to traditional Chinese, and vice versa.

Here were the first language options for Google’s translation system:

First stage (October 2007): All language pairs previously available were available in any language combination.
Second stage: English to Hindi, and vice versa.
Third stage (May 2008): Bulgarian, Croatian, Czech, Danish, Finnish, Greek, Norwegian, Polish, Romanian, Swedish, with any combination.
Fourth stage (September 2008): Catalan, Filipino, Hebrew, Indonesian, Latvian, Lithuanian, Serbian, Slovak, Slovene, Ukrainian, Vietnamese.
Fifth stage (January 2009): Albanian, Estonian, Galician, Hungarian, Maltese, Thai, Turkish.
Sixth stage (June 2009): Persian.
Seventh stage (August 2009): Afrikaans, Belarussian, Icelandic, Irish, Macedonian, Malay, Swahili, Welsh, Yiddish.
Eighth stage (January 2010): Haitian Creole.
Ninth stage (May 2010): Armenian, Azeri, Basque, Georgian, Urdu.
Tenth stage (October 2010): Latin.
Etc.

A speech program was launched in 2009 to read the translated text, with new languages added over the months. In January 2011, people could choose different translations for a word in Google Translate.

Google Translator Toolkit is a web service allowing (human) translators to edit the translations automatically generated by Google Translate. Translators can also use shared translations, glossaries and translation memories. Starting in June 2009 with English as a source language and 47 target languages, Google Translator Toolkit supported 100,000 language pairs in May 2011, with 345 source languages into 345 target languages.

2009 > 6,909 LIVING LANGUAGES IN THE ETHNOLOGUE

[Summary] 6,909 living languages were cataloged in the 16th edition (2009) of “The Ethnologue: Languages of the World”, an encyclopedic reference work freely available on the web since 1996, with a print book for sale. As stated by Barbara Grimes, its editor from 1971 to 2000, the Ethnologue is “a catalog of the languages of the world, with information about where they are spoken, an estimate of the number of speakers, what language family they are in, alternate names, names of dialects, other socio-linguistic and demographic information, dates of published Bibles, a name index, a language family index, and language maps." A core team of researchers in Dallas, Texas, has been helped by thousands of linguists gathering and checking information worldwide. A new edition of the Ethnologue is published approximately every four years.

***

6,909 living languages were cataloged in the 16th edition (2009) of “The Ethnologue: Languages of the World”, an encyclopedic reference work freely available on the web since 1996, with a print book for sale.

As stated by Barbara Grimes, its editor from 1971 to 2000, the Ethnologue is “a catalog of the languages of the world, with information about where they are spoken, an estimate of the number of speakers, what language family they are in, alternate names, names of dialects, other socio-linguistic and demographic information, dates of published Bibles, a name index, a language family index, and language maps."

A core team of researchers in Dallas, Texas, has been helped by thousands of linguists gathering and checking information worldwide. A new edition of the Ethnologue is published approximately every four years.

The Ethnologue has been an active research project since 1950. It was founded by Richard Pittman as a catalog of minority languages, to share information on language development needs around the world with his colleagues at SIL International and other language researchers.

Richard Pittman was the editor of the 1st to 7th editions (1951-1969).

Barbara Grimes was the editor of the 8th to 14th editions (1971-2000). In 1971, information was expanded from primarily minority languages to encompass all known languages of the world. Between 1967 and 1973, Barbara completed an in-depth revision of the information on Africa, the Americas, the Pacific, and a few countries of Asia. During her years as editor, the number of identified languages grew from 4,493 to 6,809. The information recorded on each language expanded so that the published work more than tripled in size.

In 2000, Raymond Gordon Jr. became the third editor of the Ethnologue and produced the 15th edition (2005).

In 2005, Paul Lewis became the editor, responsible for general oversight and research policy, with Conrad Hurd as managing editor, responsible for operations and database management, and Raymond Gordon as senior research editor, leading a team of regional and language-family focused research editors.

In the Introduction of the 16th edition (2009), the Ethnologue defines a language as such: "How one chooses to define a language depends on the purposes one has in identifying that language as distinct from another. Some base their definition on purely linguistic grounds. Others recognize that social, cultural, or political factors must also be taken into account. In addition, speakers themselves often have their own perspectives on what makes a particular language uniquely theirs. Those are frequently related to issues of heritage and identity much more than to the linguistic features of the language(s) in question."

As explained in the introduction, one feature of the database since its inception in 1971 has been a system of three-letter language identifiers (for example “fra” for French), which were included in the publication itself from the 10th edition (1984) onwards.

At the invitation of the International Organization for Standardization (ISO) in 2002, SIL International prepared a new standard that reconciled the complete set of codes used in the Ethnologue with the codes already in use in the ISO 639-2 standard (1998), that identified only 400 languages, as well as codes developed by Linguist List to handle ancient and constructed languages. Published in 2007, the ISO 639-3 standard provided three-letter codes for identifying nearly 7,500 languages. SIL International was named the registration authority for the inventory of language identifiers, and administers the annual cycle for changes and updates.

2010 > A UNESCO ATLAS FOR ENDANGERED LANGUAGES

[Summary] In 2010, UNESCO (United Nations Educational, Scientific and Cultural Organization) launched a free Interactive Atlas of the World’s Languages in Danger. The online edition is a complement of the print edition (3rd edition, 2010), edited by Christopher Moseley, and available in English, French and Spanish, with previous editions in 1996 and 2001. 2,473 languages were listed on 4 June 2011, with a search engine by country and area, language name, number of speakers from/to, vitality and ISO 639-3 code. The language names have been indicated in English, French and Spanish transcriptions. Alternate names (spelling variants, dialects or names in non-Roman scripts) are also provided.

***

In 2010, UNESCO (United Nations Educational, Scientific and Cultural Organization) launched a free Interactive Atlas of the World’s Languages in Danger.

The online edition is a complement of the print edition (3rd edition, 2010), edited by Christopher Moseley, and available in English, French and Spanish, with previous editions in 1996 and 2001.

2,473 languages were listed on 4 June 2011, with a search engine by country and area, language name, number of speakers from/to, vitality and ISO 639-3 code.

The language names have been indicated in English, French and Spanish transcriptions. Alternate names (spelling variants, dialects or names in non-Roman scripts) are also provided.

# About language vitality

UNESCO’s Language Vitality and Endangerment framework has established six degrees of vitality/endangerment: safe, vulnerable, definitely endangered, severely endangered, critically endangered, extinct.

“Safe” — not included in the atlas — means that the language is spoken by all generations and that intergenerational transmission is uninterrupted.

“Vulnerable” means that most children speak the language, but it may be restricted to certain domains, for example at home.

“Definitely endangered” means that children no longer learn the language as a mother tongue in the home.

“Severely endangered” means that the language is spoken by grandparents and older generations. While the parent generation may understand it, they don’t speak it to children or among themselves.

“Critically endangered” means that the youngest speakers are grandparents and older, and they speak the language partially and infrequently.

“Extinct” means there are no speakers left. The atlas includes languages presumed extinct since the 1950s.
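Because the six degrees above form an ordered scale, software working with data from the atlas could encode them as an ordered list and compare or filter languages by endangerment level. The list and `more_endangered` helper below are an illustrative sketch, not an official UNESCO data format:

```python
# UNESCO's six degrees of vitality/endangerment, ordered from safest
# to extinct, as described above. Position in the list encodes severity.
VITALITY_SCALE = [
    "safe",
    "vulnerable",
    "definitely endangered",
    "severely endangered",
    "critically endangered",
    "extinct",
]

def more_endangered(a, b):
    """True if degree a lies further along the scale than degree b."""
    return VITALITY_SCALE.index(a) > VITALITY_SCALE.index(b)
```

Such an ordering would, for instance, let a query over the atlas data select only languages at or beyond a chosen degree of endangerment.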

# How to define an endangered language

When exactly is a language considered endangered? As explained by UNESCO on the interactive atlas’ website: “A language is endangered when its speakers cease to use it, use it in fewer and fewer domains, use fewer of its registers and speaking styles, and/or stop passing it on to the next generation. No single factor determines whether a language is endangered.”

UNESCO experts have identified nine factors that should be considered together: (1) intergenerational language transmission; (2) absolute number of speakers; (3) proportion of speakers within the total population; (4) shifts in domains of language use; (5) response to new domains and media; (6) availability of materials for language education and literacy; (7) governmental and institutional language attitudes and policies including official status and use; (8) community members’ attitudes towards their own language; (9) amount and quality of documentation.

What are the causes of language endangerment and disappearance? “A language disappears when its speakers disappear or when they shift to speaking another language — most often, a larger language used by a more powerful group. Languages are threatened by external forces such as military, economic, religious, cultural or educational subjugation, or by internal forces such as a community’s negative attitude towards its own language. Today, increased migration and rapid urbanization often bring along the loss of traditional ways of life and a strong pressure to speak a dominant language that is — or is perceived to be — necessary for full civic participation and economic advancement.”

Copyright © 2012 Marie Lebert

