Monday, April 23, 2007

Article 7: Censorship Ranks as the Top Internet Issue

Here is an article that elucidates the Internet's biggest issue. The survey tracks demographics and trends in cyberspace and paints a plaintive picture.

Memo 13: Internet Issues

When I think of the Internet today, I remember a line: ‘With power comes a lot of responsibility’. This line, I believe, holds very true for the Internet Society and associations like the IETF, IRTF and IAB.
The Internet is so powerful and pervasive that it has become the lifeline of modern human existence. Nothing could better articulate the stature and influence of the Internet than the presence of a whole new virtual and secondary world, the most stunning replica of the real world and its aspirations, in the form of Second Life.

However, this has come at a price. There are some raging issues that the Internet faces. Not only do these issues have the potential to jeopardize the noble intent of the Internet, they can also vilify its existence. I will address some of the many issues in this memo.

Security
First up is security. Although tremendous work has been done in this domain, there are always cracks in the wall. The most notorious ones are DoS (denial of service) attacks. These are malicious attempts to deny legitimate users their right to certain services. The impact varies from the inefficiency of modern equipment to damage to physical security.
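A common first line of defense against such flooding attacks is rate limiting. The sketch below is my own illustration (the class name, capacity and refill rate are hypothetical, not from any particular product): a token-bucket limiter that lets a client burst briefly, then throttles further requests.

```python
import time

class TokenBucket:
    """Naive token-bucket rate limiter: a common first defense against
    request-flooding DoS attacks. Limits here are purely illustrative."""
    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # request admitted
        return False      # request throttled

bucket = TokenBucket(capacity=5, refill_per_sec=1)
results = [bucket.allow() for _ in range(10)]  # a burst of 10 requests
```

With a capacity of 5 and a burst of 10 near-instant requests, the first 5 are admitted and the rest are throttled until the bucket refills.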

Privacy
Next are the privacy issues. These are essentially caused by some kind of security breach, but they have both technical and legal dimensions. The public nature of the Internet has made it vulnerable to privacy issues. The very fact that a packet you send or intend to receive passes through alien nodes makes privacy a major concern. There have also been legal cases wherein the definition of privacy in a given context has been questioned.

Pornography
Although this is one factor said to account for as much as 60% of Internet traffic, it raises serious concerns. Several content providers are involved in child pornography, and that content floats unmonitored on the Internet. It is also very easy for people not legally permitted (due to age, national rules, etc.) to access the prohibited content.

Copyright
The Internet has been one of the best and worst sources of information. However, it has quite easily been the most prominent source of copyright issues and plagiarism. There is at present virtually no way to prevent attempts to copy material from the Internet and plagiarize content. One could minimize this by penalizing attempts and using tools like Turnitin; however, there is little recourse in instances where such laws aren't enforced.
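While prevention is hard, detection is tractable. Tools like Turnitin compare submissions against a large corpus; the toy sketch below (my own illustration of the general idea, not Turnitin's actual algorithm) measures overlap between two texts using word n-gram "shingles":

```python
def shingles(text, n=3):
    """Return the set of word n-grams ('shingles') in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: 0 = disjoint, 1 = identical."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

original  = "the internet has been one of the best and worst sources of information"
copied    = "the internet has been one of the best and worst sources of data"
unrelated = "satellite television serves rural areas with competitive programming"
```

Here `jaccard(original, copied)` is high (only one shingle differs), while `jaccard(original, unrelated)` is zero, which is the signal a plagiarism checker thresholds on.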

These were a few of the several issues that plague the Internet today. There is constant work going on to curb them, but there is also a constant effort to find ways around the rules, bend them, and thereby increase the necessity of such rules.

Thursday, April 12, 2007

Article 6: Telecom-Economy Nexus

A brief but beautiful report on the fundamentals and dynamics of the telecom-economy nexus and their mutual effect on each other.
http://www.lirne.net/resources/netknowledge/cho.pdf

Article 5: Antitrust's latest in CA...

Discover how an adamant judge in San Francisco disallows a media group's antitrust ways
http://www.reuters.com/article/bondsNews/idUSN1043240720070410

Memo 12: Telecom has got the Kangaroos hopping!

A country's economy is often shaped and adorned by the basic infrastructure that she wears. Not only is it indicative of a nation's fiscal health and well-being, it is also a yardstick for measuring its potential and prowess to develop further. In today's post-information age, traditional infrastructure in the form of water supply, roads, rails, electricity and so on remains extremely important to sustaining a country's economic growth. However, the information and telecommunication technology infrastructure has proven to be the one that actually drives that growth.
The human capital
This memo studies the impact of telecommunication infrastructure and services on the economy of Australia. One of the most amazing aspects of the Australian case is that the impact of telecommunications on GDP has been growing at twice the rate at which telecommunications itself has been growing. This has doubly enhanced the significance of telecommunications, which has led to further growth and development in the sector. The entire chain reaction has produced some explosive growth indices, not only in the telecommunications sector but in the overall economy as well.
First up, let us look at the employment scenario. A recent study by the Government of Australia showed that approximately 100,000 Australians work in the telecommunications industry, and the rate at which people are being employed by the industry has been increasing by about 11% every year. These are phenomenal numbers and help explain why Australians are the 4th most intensive users of information and communication technologies in the world.
This has a non-obvious but very significant implication. Since the industry has been aggressively driving related employment, there has been a tremendous boost in the education and vocational training industry. More and more individuals have been investing in education and training in the technological, financial and regulatory aspects of telecommunications. This, in turn, is driving a whole new parallel industry of hi-tech education and training, which is now responsible for 12.8% of the country's GDP.

I drive Me
Second, the industry has been responsible for its own growth. Unlike conventional infrastructure, a telecom infrastructure's value and economy grow super-linearly with the number of users it has and the number of services offered over it. So as more and more people and services have attached themselves to the network, the increase in the network's significance and value has led to increased investment in equipment, hardware, software, and the economic and policy analysis of telecommunications. This has led to greater maturity and research in each of these sectors, and hence to the creation of markets in several cases. For instance, SMS- and MMS-related services and advertising comprise almost 33% of revenue for an average telecom operator in Australia. This has attached still more people and services to the network, and hence the industry has been driving itself.
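The growth-with-users claim above is usually formalized as Metcalfe's law: a network's potential value scales with the number of distinct user pairs, roughly the square of the user count. A quick back-of-the-envelope check (my illustration, not from the cited report):

```python
def potential_links(n):
    """Number of distinct pairwise connections among n users.
    Metcalfe's law takes this (roughly n^2 / 2) as a proxy for network value."""
    return n * (n - 1) // 2

# Doubling the user base roughly quadruples the potential connections,
# which is why each new subscriber makes the network more attractive to join.
small = potential_links(1_000)   # 499,500 possible links
large = potential_links(2_000)   # 1,999,000 possible links
```

So a network twice the size offers about four times the connections, which is the mechanism behind the "industry driving itself" dynamic described above.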

Side Effects..
Also, the telecommunications industry has been aggressively driving both labor-productive and labor-intensive growth across diverse industry domains. Over the past 5 years there has been an indirect impact on the growth of some of the basic industries, mainly through the facilitation of enhanced efficiencies in day-to-day business, communication costs, enterprise networks and so on. The adjacent graph illustrates the impact:




The phenomenon
I could go on and on describing the impact of telecommunications on Australia's economy. It is just phenomenal. The brief indicators described in this memo give an overall picture of the nature and potential of the impact. Estimates of the future impact look even more encouraging:
“- Boost the national output by 3.7% over the next decade
- increase real investment by 4% and consumption by 3%
- increase employment by 5.3% and real wages by 4.44%
- contribute 3% to an appreciation of the real exchange rate” [1]
As is evident, the value of the impact telecommunications has had on the Australian economy can only be realized by studying the impact of its value.

References:
All figures and statistics and [1] taken from “Impact of Telecom in Australia” by Australia Telecom Society, Center for Strategic Economic Study, Australia 2006.

Wednesday, April 4, 2007

Memo 11: Antitrust? ... The EchoStar-DirecTV case reveals Washington's primary concern

Indulgence often defeats the supposed intention and causes unforeseen repercussions. Legal practices and policy precedents sometimes force me to relate them to the ugliness of human intervention in a natural ecosystem. Several instances have now borne testimony to the fact that an attempt to regulate the market, to make it conform to the modern-day ethics of competition, has actually jeopardized that very purpose. Often, just as natural processes take care of the balance in an ecosystem, market forces and trends should be allowed to dictate the dynamics of market change.


The Opposition to Merger
Four years ago, vehement opposition disguised in the attire of antitrust halted the DISH TV and DirecTV merger. This was made possible largely by policy makers' clout in the FCC. Several factors stopped the $26.2 billion combination of the nation's two biggest direct broadcast satellite players. [1]
The primary argument was that the 'hostile' takeover of Hughes Electronics' DirecTV by EchoStar Communications' DISH TV would give it a market share of 91%, making the direct broadcast satellite market a virtual monopoly. Second, Rupert Murdoch's News Corporation, which sells programming like Fox News and FX to both cable and satellite companies, lobbied the Department of Justice to stop the merger on antitrust grounds. [2] "Also opposing the deal was the 'Western Caucus,' a bipartisan coalition of members of Congress representing primarily Western and rural districts. According to a letter sent to U.S. Attorney General John Ashcroft and Michael Powell, chairman of the Federal Communications Commission, the group's members had grave concerns that such a merger would increase costs and decrease options for their constituents who wanted direct broadcast satellite television service. The result for rural America, they wrote, would be a monopoly with essentially no hope for future entrants." [2]


Anti-trust camouflage
The reasons above clearly indicate immature handling of the issue on the policy front, supported by some irrational arguments. The primary argument of a 91% market share overlooks the fact that the merged company would constitute a mere 20% of the overall television broadcasting market, where cable, and now the phone, companies enjoy a comfortable position. The merger would actually increase competition and force the relatively complacent cable companies to pull up their socks to stay competitive.
Second, Rupert Murdoch's cry had a cause altogether different from the one projected. For one, reduced competition in the satellite segment would naturally erode his corporation's profits from selling programming. For another, Rupert Murdoch had lost the bid to buy DirecTV to EchoStar.
Third, rural America would naturally prefer service to the lack of it. There are several places where cable companies won't dare even think of providing services, purely due to economic infeasibility. Satellite TV would be the only possible alternative, and EchoStar would obviously want to keep the pricing competitive and in sync with that offered to urban counterparts so as to tap the rural market completely. Also, provisioning services at rates competitive with those of cable companies in urban markets would actually raise their profit in the urban segment (as satellite TV broadcast becomes more profitable in an area of increased density) and help them cross-subsidize their rural services. In an essay published in the Rocky Mountain News in late January, Wallop wrote, "The satellite industry has a strong track record of serving rural areas, not with promises but with programming".
Apart from this, intervenors should base their policies on technology-specific reasoning. The satellite segment is ten times as risky as the cable or telephone segment; comparatively, a lot of investment and technological expertise goes into satellite-based services. This demands a soft corner for players in this segment, so as to encourage future entrants and hence competition. In an open, competition-encouraging market, fears of monopolistic domination no longer hold much appeal. Entrepreneurial vigor and technology often balance the market forces naturally. For instance, just when regulators were pushing hard to discover ways to share the copper network, along came the vibrant cellular phone technology. It has now virtually and unknowingly created one of the most ideal markets, upholding all the ideals of capitalism and competition. In a similar way, there are some very dynamic developments taking place in the satellite segment too. For instance, "wireless telecom entrepreneur Craig McCaw has been busy funneling billions into satellite ventures such as ICO and Teledesic. These satellite constellations would offer ubiquitous high-speed service across the globe and have attracted an impressive group of investors, including Microsoft founder Bill Gates. And let's not forget about wi-fi and ultra-wideband which could revolutionize the way we communicate". [3]

The Real Concern...
Hence, more than anti-trust, I believe the policy makers and guardians of legal sanity need to concentrate on anti-rust, because decisions in the recent past have displayed a rust in their rationale and decision-making abilities. As Adam Thierer, Director of Telecommunications Studies, and Clyde Wayne Crews Jr., Director of Technology Studies at the Cato Institute in Washington, D.C., say, "regulatory ethos today are wholly corrupt and completely at odds with foundations of a free and just capitalist system. Sadly, morality-based arguments don't get very far in Washington these days".

References:
[1] Barthold, Jim "FCC rejects EchoStar-DirecTV Merger" http://telephonyonline.com/news/telecom_fcc_rejects_echostardirectv/ October 2002
[2] Bast, Joseph "Farm Bureau endorses EchoStar/DirecTV Merger" http://www.heartland.org/Article.cfm?artId=399 May 2002
[3] Thierer, Adam and Crews, Clyde Wayne "EchoStar-DirecTV Merger Critics Propose Infrastructure Socialism in Outer Space" http://www.cato.org/tech/tk/021008-tk.html October 2002

Monday, April 2, 2007

Memo 10: Google Doc(k)s In....

Competition has always driven innovation. But not all innovation drives competition. Only the innovations that have greater appeal to the masses, and that are not drastically radical, have more often found success. People have always been slow and wary to react to change. When something new can be associated with its predecessor, and the change is evident in its improvements, it is adopted much more easily and quickly than something the multitudes cannot relate to.
One such example is the area of word processors.


Microsoft has ruled this sector for decades now with its almighty MS Word. It has been a marvelous word processor and has served people quite loyally for quite some time. Although we keep cribbing about Microsoft products, its anti-competitive policies, its shortcomings and so on, we all use Microsoft products, especially when it comes to Office. The simple reason is that the MS Office products are simple, easy to use, very effective and something of a must-have.
MS Word is not free, and that is probably the major reason why a few people (believe me, they are very, very few in number) do not use it. Its alternatives, like OpenOffice Text Document (from Sun, and free) and Corel WordPerfect (not free, but with a long free trial period), are ridiculous replicas of MS Word (even the shortcuts are similar). This says a lot about the effectiveness of MS Word. Since people have been using MS Word for quite some time, its alternatives have to be pretty similar to facilitate easy migration. Also, MS Word is very user-friendly, and trying to make something different and complicated will not appeal to the common man.

There is, however, a new player. Google Docs and Spreadsheets is the new entrant. Unlike MS Word and its counterparts, it is conceptually different: it is a Web word processor. Instead of the document being saved on your hard disk, it gets saved on a server under your Google account. This is very consistent with Google's ideology of shifting computing away from the individual to the central grid. Hence a hard disk crash does not affect your important documents. Also, the document is available from any place, at any time, on any machine with Internet access.
Secondly, the interface is very similar to that of MS Word. Considering the basic editing tools like cut, copy, paste, undo, spell check, font, paragraph and so on, Google Docs is easy to relate to. However, these functionalities are very basic in nature and not as comprehensive.

The USP of Google Docs, however, is the ability to collaborate. Since the documents are online, one can invite several collaborators to work on the same document simultaneously. This is something very new and very useful, especially today when corporate culture relies so much on collaborative and group project reports. The document also gets auto-saved periodically.
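The periodic auto-save described above can be sketched as a change-detecting snapshot loop. This is my own simplification for illustration; Google's actual mechanism is server-side and not public:

```python
class AutoSaver:
    """Toy auto-save: call tick() from a timer; it persists the document
    only when the text has changed since the last save."""
    def __init__(self, save):
        self.save = save          # callback that persists one snapshot
        self._last_saved = None

    def tick(self, text):
        if text != self._last_saved:
            self.save(text)
            self._last_saved = text
            return True           # a save happened
        return False              # nothing changed, skipped

snapshots = []
saver = AutoSaver(snapshots.append)
saver.tick("draft v1")   # saved
saver.tick("draft v1")   # unchanged, skipped
saver.tick("draft v2")   # saved
```

The change check means idle documents generate no save traffic, which matters when every save is a round-trip to the server.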

So with all the great features and innovation, is Google Docs the next generation word processing software? Is it all set to replace MS Word?
Well, it has a long way to go. Its collaborative feature loses appeal when more than 9-10 people work on a document simultaneously: the response is very slow, laggy and not comfortable to work with. Secondly, Internet access, although much easier to come by than in the past, isn't yet a given. Consider travelling on a plane or a bus, where you have time to work on a document but Google Docs won't let you. Also, it is psychologically more convenient to go to MS Word for a document than to Google Docs: for the latter, you must connect to the net, go to Google, sign in and only then start the document.
Also, it does not incorporate all the features that MS Word has, or that one would like a word processor to have, such as tables, formulas, rulers, headers, footers, page layouts and so on. It is very difficult to integrate pictures, graphs and the like into a document when compared to MS Word and its counterparts.

Hence, I believe that Google Docs is certainly not a replacement for the orthodox word processing software. However, it is a great concept that encourages centralized, collaborative and secure computing. (Contrary to common notion, I believe PCs are more vulnerable than servers and online storage.) It also has great scope for plain-text-based applications that require multiple access and backup.

Saturday, March 17, 2007

Memo 9: Get a life...well, a Second Life!

Never before has virtual reality been experienced more vividly. Human imagination, fantasies and the desire for a near-utopia have been stretched to their maximum in an effort to capture a digital replica of mortal existence. Second Life is a creative innovation in this direction. It is a dynamic world like the real world, wherein an established presence has the ability to explore a realm of possibilities never considered in reality.

So what did Philip and Cory from Linden Lab, the architects of Second Life, have in mind while creating it? I believe they wanted to give creativity a whole new definition, and haven't they done just that. A massive conglomeration of server farms supports the simulation of the large number of objects created every moment, and continually re-establishes a state of the world that appears the same to everyone on the outside. So there is massive, coordinated activity going on on the server side of the story. This has made Second Life an economy and a geography as big as Monaco's. People build objects, own the IP for their own creations, market their products and make money.

So is Second Life an auxiliary platform for a mere secondary business opportunity? Yes, but it is not restricted to that. Huge corporate houses have established a Second Life presence. Not only are they evaluating the scope and potential of Second Life, they are also using it as an experimentation laboratory for their business models and product tests. There are educational and training workshops conducted in Second Life, which is really an awesome phenomenon. There are universities, colonies, resorts, labs, and almost everything one would need to gauge the response to a test in a real-life environment. And the fascinating thing is that it is evolving at a very fast rate.

After having signed up for Second Life and downloaded the necessary software, I was ready to venture into a whole new world. I knew that at some point I would have to buy Linden dollars, the planet's currency, to get more mileage out of the experience, but initially I was happy to explore the new place. The most refreshing thing, which is only a theoretical possibility in reality, was teleportation. I had the entire geographic expanse at my disposal to choose from, and could get anywhere in no time and with no money. So I skipped the orientation and reached an open fair on the diametrically opposite side of the map. There were people, and lots of people. All had weird names, and some had very funky avatars. Being a novice, I felt that I too needed an image makeover. So I went on to change my avatar, something I would never be able to do so quickly in reality. Ready for the social embrace, I interacted with a few people. Some had been residents for quite some time, and some were very new to the world, like me. It has a cool feature one would love to have in reality: before approaching an individual, one can research the person's profile and draft the conversation accordingly. After a bit of loitering around and getting a feel for the environment, I decided to get down to some serious business.

I went through the orientation, or the introductory workshop, and got a better understanding of the new world. I planned to back the orientation with some 'real' experience, so I signed up for a free workshop on Second Life controls. It was an amazing experience in which I learned how to create objects and the basic building blocks for bigger structures. This is the first step towards establishing a presence in Second Life. From here on, I realized that with a creative appetite, one could do wonders. It is like having the power of the creator himself. You can build your own world and your own gadgets, empower them with all the desired features, use them, sell them, and leverage a plethora of business opportunities. One can build one's own social network with an unimaginable diversity to exploit. I am actually clueless with regard to the possibility, opportunity and unfathomable scope of this digital mania. The more I talk about it, the more I will sound shallow and illiterate, as Second Life is evolving continually and rapidly, even at this very moment as I write.

I personally have not spent much time in Second Life, but from whatever little I have, I am totally in awe of this thing. Getting used to it does take a bit of time and patience, but it is worth the effort. I believe that making the interface easier and instructions more readily available will bring in more and more residents. The concept needs modification and adaptation to the origins of the growing user base. There are unaddressed issues like impersonation, sabotage, virtual vandalism and so on. However, I can relate to the SLosphere as a land of opportunity and a canvas for human creative exhibition.


Sunday, March 11, 2007

Memo 8: Casting Pods....

The Internet continues to articulate human expression with its ever-evolving definition of services. The newest in the basket is podcasting. This novel way of delivering and accessing content is a step towards redefining the concept of the press. We now have interesting and non-conventional opinionated reviews on every possible topic, by self-acclaimed experts, distributed across the net. People, the potential subscribers, build up loyalty towards a particular podcast portal (Apple iTunes podcasts, Yahoo! podcasts, CNN pods, etc.) for a particular subject and access these 'information' bites. So gradually we are seeing an evolving phase where the press is becoming not only for the people and of the people, but also by the people.

I have been party to this world for only the past 3 weeks, and I feel it's a crazy world out there. Some podcasts are really interesting and informative, but some are pathetic to the core. You sometimes get the feeling that this innovative and potentially revolutionary exercise could lose its flavor if junk is allowed to thrive in the podcasting arena.


The Misses..

I will start with my nightmarish experiences, as it is always good to end on a nice note. I will not rank the sad podcasts, as all of them were equally disheartening.

The first: 'Night Clubbin in Berlin' from Street Fury. This is a video podcast that starts with a man wearing dark glasses, hopping around Berlin's night clubs and talking to people at midnight. Only he knew what he was after, as the people he met and the questions he asked made no sense. Instead of telling us more about Berlin's night life, it seemed like one of the Discovery Channel's night-vision videos where the cameramen keep following the beasts.

The second: 'Horse Riding and Handling through Field' by Leslie Desmond. This audio pod was supposed to introduce the audio book, but had nothing that even remotely did so. Throughout, the only thing one could associate with the topic was the neighing of the horse. The author then attempts to say something about what a horse can first be taught, but it was so muddled and out of context that it seemed that while the pod was being recorded, the horse was riding her rather than she the horse. The pod then ends with the horse's neighing, to remind us that it had something to do with a horse.

The third: 'Square Talk ... Three teens talk on Economics'. The impressive title drew me to the pod. When I started listening, first, I could hear only one person. Second, in no way did he seem a teen. Third, the only economics he talked about was that he was to get a $10 tax refund from the IRS, which he would use to gift a watch to his girlfriend, who always turns up late.

These are some of the forgettable experiences that I have had with podcasts. Podcasting can sometimes turn out to be a place, especially on a bad day, that is bent upon psyching you out.


The Hits...

Let's move to the greener side of the story.

The 'Amateur Traveler Podcast' is one that no travel freak would want to miss. The narrator gives one of the most exciting descriptions of the places he visits. The impressive thing about the pod is that, despite being just an audio pod, the narrator is actually able to pull up a panoramic visual of the place before the listener. The sounds in the background, the interviews with people and his descriptive personal touch all add up to a very lively experience.

Second, the 'Money Tree Podcast by Martin Bamford' is a relish as far as financial advice is concerned. He addresses various personal finance issues in a refreshing manner. Especially for people like engineers, who want to know a little but enough about finance, this is an apt place. The concepts are presented in the most succinct yet sophisticated manner.

The third, 'GIS and Location Technology', is, I must say, an excellent pod. It's a technology pod, and it covers one of the most challenging technological ventures taken up by Farallon Geographics: the dynamics of global location technology, right from RFID for locating articles to GPS for locating people, and associated aspects. It also discusses the project's challenges, invites comments and integrates reasoning into its descriptions.


These were my opinions on a few of the many pods I have been exposed to over the past 3 weeks. Podcasting opens up a whole new avenue of unbounded expression. It is for the subscribers to choose their level of exposure to the worst of the best or the best of the worst.

Monday, March 5, 2007

My Wikipedia Articles

I have posted 2 articles on Wikipedia:

The first article is an edited version of my article on Chile's Telecom Liberalization. You can see that on http://en.wikipedia.org/wiki/Epitome_of_Telecom_Liberalization:_Chile
One can also find this article's link under the 'See Also' section on Chile's main page on Wikipedia.

The second article is on 'Equanimity'. When I searched for 'equanimity' on Wikipedia, I just got the word's meaning. Since I had written a related article, I posted it just to see the response.
I would encourage you to visit http://en.wikipedia.org/wiki/Equanimity as it talks about a less known concept in the realm of human psychology.

Memo 7: Encyclopedia Redefined


The advent of the information age has been marked by a plethora of innovative and creative platforms of access. As the evolution continues, it is not the content but the ease of its availability, its authenticity, its comprehensibility and, most importantly, the lucidity of its presentation that takes primary focus. As a result, the source that caters to the most of these aspects becomes more and more popular. This is precisely why we are witnessing a novel encyclopedia experience in the form of Wikipedia.

Wikipedia today boasts more than 5 million articles in 100 languages, covering expanses of discussion over the widest possible canvas of human deliberation. It reflects a collaborative effort of information origination, modification and presentation open to potentially every mortal. The encyclopedia is based on fundamental principles (see the Five Pillars of Wikipedia: http://en.wikipedia.org/wiki/Wikipedia:Five_pillars/) of consent, wherein content derogatory to a subject's image isn't allowed. There is also strict adherence to intellectual property rights, copyright law and rules against plagiarism. There is continuous content management with regard to nature, accuracy, legal aspects and, most importantly, temporal relevance. So where does the problem lie?

Wikipedia's credibility problems have been attributed to its openness. The very fact that it is open makes each of its assets vulnerable. Who are the ones that manage content? Who are the ones that contribute? Who are the ones that access it? Wikipedia does explicitly define a 'contributor' (one who has successfully edited at least 10 articles) and a 'qualified editor' (one who has the authority to edit any article), but this is still an open norm devoid of authoritative legitimacy. So is Wikipedia a credible source?

Mr. Credible
The credibility of content scales with its usability and acceptance. According to a recent survey by CNET News.com, Wikipedia is the 36th most visited site. Now, if we weed out pornographic sites and popular portals, Wikipedia is certainly up there. It also gets much higher hit rates than any other online encyclopedia. Such a large number of information seekers cannot all be settling for incorrect content. Part of what drives people towards Wikipedia is that it is free to access: if you have not signed up for Britannica (a paid membership), most of its articles and subject matter are inaccessible, and in today's age one cannot expect people to pay for information. Secondly, every subject is covered in a lucid, systematic manner, right from background and introductory description to special features and pros and cons, with links for every related word or topic. This means that if a reader gets stuck somewhere in an article on a related but unknown word, he can click on its link, pull up that word's page, understand it in the relevant context and get back to his article. These features actually help someone like an engineer comprehend topics in law. As a result, Wikipedia content is used and accepted much more than any other source.

For a simple example, when one types ‘telephone’ in the search box of both Wikipedia and Britannica, here is what you get:


Now, as a user who does not want to pay, Wikipedia certainly is much more appealing. Secondly, if one observes, Wikipedia goes about the topic in a very systematic manner, right from the layman's version of the telephone to IP phones. On the other hand, Britannica (although it's a preview of the main article) starts with a basic definition, then makes it complex and jumps to Bell's patent. It does, however, give an outline of the actual article (accessible only to members) so as to try and lure users towards membership.

Defining Correctness........
Now, about the content's academic correctness. In a comparative study conducted by the journal Nature on the quality of Wikipedia and Britannica articles, it was found that of the 42 articles compared, four of Wikipedia's had minor flaws as against Britannica's three; as far as fundamental errors were concerned, Wikipedia had two against Britannica's one. This shows that even the so-called 'mother of all information sources' is not flawless. Correctness, to an extent, is also determined by context and purpose. For instance, a grade 2 textbook may use the terms speed and velocity interchangeably, whereas a grade 7 textbook will distinguish speed and velocity as a scalar and a vector quantity respectively; the grade 2 book is still correct in its own context, where both simply mean the rate of motion. Hence, in an attempt to articulate explanations in a manner that is universally easy to comprehend, Wikipedia's articles might occasionally be 'inaccurate'. Given the results of the study, however, Wikipedia certainly cannot be dismissed as wholly lacking credibility, though one does want to think twice before relying on it, since its articles lack accountable authorship.
One must give credit to Wikipedia's loyalists, who continuously monitor content to maintain sanity; as a result, most articles stay updated. Secondly, every article has room for improvement, since each can be openly edited within the correctness parameters, and this more often than not gives a subject a fresh perspective. Encyclopedias like Britannica cannot incorporate this creativity and dynamism, as their subjects are addressed through scholarly conventionalism. Wikipedia's approach adds color and diverse dimensions, which not only broadens the scope of a subject but also enhances its usability, acceptability and hence credibility.

It's not Wikipedia or Britannica....It's Wikipedia and Britannica
Occasionally, mishaps like the Seigenthaler and Curry incidents do happen, but I guess Wikipedia, as an effort toward information's dynamism, should be given due chances; there are many more positives in its evolution. What people do not understand is that Wikipedia never claimed it would replace Britannica. It is people who want to compare novelty with conventionalism and take sadistic pleasure in proving the obvious. I believe Wikipedia is a totally different platform and should not be viewed along the same lines as Britannica. If one is looking for scholarly material, at the risk of some difficulty in understanding it, Britannica is the place; but if one wants a generic idea of a totally new concept, Wikipedia proves much better. It has a unique place in the world of organized information and should be used in the manner its nature demands, so that its credibility can be appreciated.

Sunday, February 25, 2007

Article 4: Back to School

Ever wondered how difficult it is to keep up with the growing technological demands of schools, teachers, students and the educational community? Here is an article that highlights these less-discussed problems: http://lrs.ed.uiuc.edu/wp/access/funding.html

Memo 6: "Hello, emergency?....this is 911..I need help.."!


9-1-1 needs emergency help! It is not the technology but a lack of regulatory vigilance, administrative inefficiency, executive indiscipline and ethical misbehavior that has led us to a situation where the E911 fund itself needs emergency funding.

A brief background
The significance of providing emergency service through a common number all across the nation reveals some interesting aspects of the technology, economics and policy surrounding a national telecom infrastructure. In the US, all states levy a 911 surcharge on a per-line basis; on average, a monthly charge of about $0.70 per line has been mandated. This helps create a strong pool to fund E-911 services, which include enabling the call routing, caller identity and location technology; establishing and maintaining PSAPs (public safety answering points); and promptly notifying the appropriately located and designated authorities. This works fine when we consider only legacy wireline telephone service, where things are technologically simple. As parallel communication technologies evolve, however, one has to devise ways to incorporate the same emergency service provision, so the FCC mandated that all technologies, including wireless and VoIP telephony, support this aspect of the service.
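The surcharge arithmetic above is straightforward but worth making concrete. A minimal sketch, assuming the memo's $0.70-per-line average; the 5-million-line state below is a hypothetical example, not any real state's figure:

```python
# Rough arithmetic behind the E911 funding pool: every access line is billed
# a small monthly 911 surcharge (about $0.70 on average, per the memo), and
# the proceeds pool into the state fund.

def annual_e911_fund(lines: int, monthly_surcharge_cents: int = 70) -> float:
    """Dollars collected per year for `lines` billed access lines."""
    return lines * monthly_surcharge_cents * 12 / 100

# A hypothetical state billing 5 million wireline access lines:
print(f"${annual_e911_fund(5_000_000):,.0f} per year")  # $42,000,000 per year
```

Even modest per-line amounts pool into a substantial fund, which is why the memo's redistribution complaint matters so much.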
Providing wireless 911 service is technologically feasible and has been rolled out in two phases. Phase I deployment involves the call-routing technology and has been achieved in 80% of the states. Phase II deals with location identification and its notification to the nearest PSAP; to date it is fully functional in only 8 states.

The problem
Prof. Dale Hatfield describes the 911 issue as "a wonderful systems engineering problem. It's really a perfect example." In his Report on Technical and Operational Issues Impacting the Provision of Wireless Enhanced 911 Services, Prof. Hatfield points out that a lack of coordination among the various stakeholders has delayed the effort, and as a result there are now questions about its funding. The initial delay can be attributed to service providers that relied on the FCC's cost-recovery clause. In a way they were justified, because the cess was collected by the respective state governments' advisory boards precisely to fund provision of the service. The advisory boards now have the onus of distributing the funds efficiently among the carriers, PSAPs and LECs, and here lies the problem. The funds have been collected quite efficiently; it is redistribution that is the primary issue. Inefficient mobilization of the funds, a result of the lack of initiative and lethargy displayed by the FCC, has made funding E911 services an issue. We are now facing a black-hole situation wherein all the money that has gone in, because of administrative and regulatory lapses, is not actually coming out!
Here is one reason: “Reports from the General Accounting Office (GAO) and the National Emergency Number Association (NENA) showed that several state legislatures had diverted E911 funds to shore up budget problems elsewhere. The NENA report identified a dozen states--Arizona, California, Maine, Maryland, New York, North Carolina, Oregon, Rhode Island, South Carolina, Texas, Virginia and Washington plus the District of Columbia--that diverted as much as $400 million away from E911.”[1] Also, as Ken Louden, director of communications for Steuben County, Ind., who serves on the Indiana wireless E911 board, says, “The FCC has not stepped up to the plate to police the LECs”. This has been frustrating for the PSAPs too, as LECs are preventing them from accepting calls.

Proposals:
In this chaos, we are sidelining the fundamental aspect of 911: come what may, every citizen, irrespective of the technology he uses, must be able to access the service efficiently. Service provision is the key requirement. On the administrative front, people who pay the surcharge and are still deprived of what they pay for need to be more proactive in pressuring their state governments. Public awareness campaigns, political pressure and the like are avenues people need to pursue aggressively, since no one realizes the importance of 911 until facing a situation where it has to be used.
On the technological front, network-based location technology will be more efficient and economical than handset-based technology, for the simple reason that one cannot expect to modify end systems continually as technology evolves. Modifying the network once solves the problem and avoids revisiting it with every upgrade in access technology. Economically, handset-based technology requires phasing in on an individual basis, which is not only time-consuming but also expensive for the end consumer. Network-based technology, on the other hand, immediately reflects upgrades and service incorporation, and opens an array of opportunities for an entire content-based market (safety instructions, directions to the nearest help center, safety ads, etc.). This would automatically provide a self-sustaining mechanism.
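The network-versus-handset argument above is at heart a cost comparison, which a toy model can make explicit. All figures below are hypothetical placeholders for illustration, not real deployment costs:

```python
# Toy cost comparison behind the memo's argument: upgrading every handset each
# time the access technology changes, vs. retrofitting the network once.
# All numbers are hypothetical, chosen only to show the structure of the trade-off.

def handset_cost(subscribers: int, per_handset: int, upgrades: int) -> int:
    """Recurring cost: every subscriber's device changes with each tech upgrade."""
    return subscribers * per_handset * upgrades

def network_cost(cell_sites: int, per_site: int) -> int:
    """One-time cost: location technology retrofitted at the network edge."""
    return cell_sites * per_site

subs, sites = 10_000_000, 20_000
print(f"handset route: ${handset_cost(subs, per_handset=25, upgrades=3):,}")
print(f"network route: ${network_cost(sites, per_site=15_000):,}")
```

Because the handset route multiplies by the subscriber base and by every future upgrade cycle, it scales far worse than a one-time network retrofit, which is the memo's point.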
On the regulatory front, mechanisms need to evolve to bring transparency to the entire process of fund management. Accountability cannot be neglected, and misappropriation should not be permitted to camouflage itself as inadequacy.

References:
[1] David Chamberlain, "Lawmakers to blame for E911 scandal", http://www.findarticles.com/p/articles/mi_m0DUJ/is_14_108/ai_n9505612

Article 3: Sun's 'big conversations' with the telecom industry


Discover how the biggest open source loyalist, Scott McNealy of Sun, is using his baby, the WebTone Switch, as an international business strategy tool to lure global Telecommunications Platform Providers (TPPs) towards open standards. His 'big conversations' with the telecom industry are driving investments into Sun's strategic attempts to introduce open standards into this 'huge computer called the telecom network'.
See the linked article for details on Sun's WebTone switch and its telecom business and investment plans.

Sunday, February 18, 2007

Memo 5: International Investment & Business Model for Universal Service

“It's the attitude, not the aptitude, that decides the altitude”. - Anonymous
In every developing and developed nation's telecommunication jargon, 'universal service' is one phrase that is capturing attention. What is disappointing, though, is the attitude associated with it: most read USO as 'Universal Service Obligation'. Can't we approach the acronym instead as 'Universal Service Opportunity' and give solution a better chance over convolution?

In an era of international telecommunication liberalization, I would like, through this memo, to explore universal service as an opportunity for leveraging international investment. Telecom regulators, armed with a strong, mature policy, can craft a business strategy that lures international investment to promote universal service. On the other side of the story, international telecom companies can come up with an investment plan that uses universal service as a creative market-entry mechanism and hence a potential business breakthrough.

The biggest motivation for inviting international competition into this sector is the inefficacy of the cross-subsidy model. I believe that generating funds via a cross-domain accessibility revenue model gives the government a much better chance of materializing universal service. To clarify this, I have developed a naive model, the 'proctorial expansion model'. It has two procedures: the inside-out procedure and the outside-in procedure.

The Model

We can visualize the entire telecom domain as a set of two concentric circles. The inner circle is the rich circle, conceptually comprising an aggregation of rich telecom services provided by several players. The outer, void circle is the opportunity circle, the target of the service.

In the inside-out procedure, the government encourages every service provider in the inner circle to expand its services into an adopted sector of the outer circle by cutting the taxes charged on its inner-circle services. This, I believe, is a strong enough incentive, because the provider gains a massive cost advantage in a fiercely competitive domain and an opportunity to steal substantial chunks of the 'rich' market share by lowering its service costs. Every additional fraction of the outer sector covered is rewarded by the government in the form of further tax deductions.

In the outside-in procedure, the government gives every potential new telecom player mega incentives to enter the inner circle, if it promises to proctor some portion of the outer circle. To boost this further, the regulator can devise rules entitling every other industry with a market in a given outer sector to use only the telecom services of that sector's proctor. Although this tends to create local monopoly, globally it is still a competitive space: if a competitor can proctor a sector more cost-effectively (in anticipation of the government tax incentive), it will try to procure it, forcing the incumbent to reorganize its cost model rather than exploit its monopolistic status. After all, you need to know monopoly to know competition!
By doing this, the national telecom regulator mobilizes funds toward universal service in an automated manner. The onus is now on existing and newer players to devise ways to adapt to the model in the manner most feasible for their economics and logistics.
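The inside-out incentive described above can be sketched as a single function. This is a minimal illustration of the memo's conceptual model, not proposed policy: the base tax rate and rebate slope are entirely hypothetical knobs:

```python
# Minimal sketch of the 'proctorial expansion model' (inside-out procedure):
# the effective tax rate a provider pays on its rich-circle revenue falls
# linearly with the fraction of the outer (unserved) circle it has adopted.
# base_rate and rebate_per_fraction are hypothetical parameters.

def effective_tax_rate(base_rate: float, rebate_per_fraction: float,
                       outer_fraction_served: float) -> float:
    """outer_fraction_served is in [0, 1]; the rate is floored at zero."""
    return max(0.0, base_rate - rebate_per_fraction * outer_fraction_served)

# A provider that has covered 40% of its adopted outer sector:
rate = effective_tax_rate(base_rate=0.30, rebate_per_fraction=0.50,
                          outer_fraction_served=0.40)
print(round(rate, 2))  # 0.1 — i.e. the 30% base rate drops to 10%
```

The linear rebate is just the simplest choice; a regulator could equally make the schedule convex so that the last, hardest-to-serve fractions of the outer circle earn the largest deductions.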

Moving to the other side of the fence, there are real opportunities for telecom companies, especially international ones, to take advantage of the universal service lever. International competitors pay serious taxes and still manage to compete on superior service quality and business models. So if an international player comes up with an investment plan that satisfies the government's goal of serving the outer circle cost-effectively, it can exploit the tax deduction on its inner-circle services. The economic gains of this (luring a portion of a highly profitable market toward itself) can more than compensate for the probable loss incurred in serving a 'poor' market. The company can judiciously rework its call and associated service cost model in the competitive inner circle to take utmost advantage of the government incentive. The master stroke, however, will be to work out means of providing cheap service in the outer circle, so that even if there isn't much of a market to exploit, the return on investment stays positive.
On the technology front, WLL (wireless local loop) service is a potentially economical option; the spectrum allocation charge does not come into question because it would be a government-backed option. Secondly, backing technology that uses the vintage power-line networks to provide telecommunication infrastructure is a big opportunity for resolving the issue of universal service.
Strategically, as proctor of an adopted sector, one can invest in developing or enhancing the sector's brand equity as a potential market for suitable industries. Every new industry lured toward the sector will require telecommunication infrastructure to further its cause, and per the policy, the proctor will be the telecom solution for all of them. One can push the creative pedal to boost the economics. What is the motivation? The national regulator's patronage, which will help make the venture more and more profitable.

The entire hypothesis delineated above is an attempt to neutralize my conceptual model of an inner white circle (area of rich service) surrounded by an outer black circle (area devoid of service) into a uniform grey circle (area of universal service). This is practically a difficult proposition and the model needs refinement, but I guess it can point toward a better self-regulated approach, one that envisages moral and social responsibility on both sides (the regulator and the players) and helps them collaborate toward accomplishing the goal of universal service.

Monday, February 12, 2007

Article 2: Wanna Migrate ?


In a world of visas and immigration issues imposed by mortals, in an attempt to curb the natural human tendency to go places, ever wondered if there is a single international organization that facilitates migration globally....well here it is: International Organization for Migration

To know A to Z of migration (well..in a manner slightly different from the one shown in the picture above...) visit http://www.iom.int/jahia/page1.html

Enjoy !

Sunday, February 11, 2007

Memo 4: Epitome of Telecommunications Liberalization: CHILE


The headquarters of Compañía de Telecomunicaciones de Chile S.A., a marvelous towering structure overlooking Santiago's Río Mapocho, vividly symbolizes Chile's thriving telecommunications sector. It speaks volumes of a socio-economic revelation that has stemmed from one of the best-seen implementations of an international agreement and has now become an inspiration in the process of global telecommunications liberalization.
Chile can easily be called the prodigy of the modern telecommunications era. In an article by Larry Luxner (Tele.Com, May 2006), Wayne S. Alexander (Executive, SBC International) says, “Chile has one of the most open and the most competitive markets in the world”. As a result, it naturally has some of the best and cheapest available technologies at its disposal and is actually providing class to its masses.


The Beginning
Chile has pioneered the cause of privatization and liberalization of communication services. The process began back in the 1970s with the introduction of the National Telecom Act in 1978. It allowed for “licensing of telecommunication services in the country and also led to formation of the Subsecretaría de Telecommunicaciones (SUBTEL), the agency that would be responsible for regulating and managing the industry”.[1] SUBTEL, in collaboration with several industry players, radically changed the look of telecommunications in the country through the chronology of events shown below.


1979: Liberalization of long distance telephony
1981: Government monopoly over telegraph and telex services abolished
1992: Local telephony market liberalized


The timing of these events clearly elucidates the maturity and vision of the Chilean telecommunications policy evaluators and regulators.
It received a further boost from the WTO's Basic Telecom Agreement of February 1997. This is the most significant event to have taken place in the history of international telecom, as it set the pace for liberalizing the services sector globally. In addition to the General Agreement on Tariffs and Trade (GATT), which regulates trade in products, the General Agreement on Trade in Services (GATS) was also established.[2] The ITU had enlightened the WTO about telecom's dual role, as a fundamental network for economic activity and as a distinct economic sector in itself; it was this realization, combined with the WTO's gradual evolution during that period, that led to the agreement.
A few of the major outcomes of the agreement:[3]



Liberalization became the fundamental principle in the area of telecommunications, on a global scale, including developing countries.
Through global liberalization of the basic telecommunications services such as voice telephone services, it aimed at the introduction of competition, the reduction of service rates, diversification of services and provision of universal services.
69 of 131 member nations (including Chile) adopted the agreement.
The Government of Chile committed to market access and national treatment for long distance and international wireline and wireless telecommunications services, including satellite services, by January 1, 1998.



The Process
The WTO Basic Telecom Agreement acted as a catalyst and led to an explosion in Chile's telecom market. In the wake of the agreement, four reform models were adopted by countries globally: [4]



Model 1 : Privatization with full competition - NZ, Chile, Malaysia
Model 2 : Privatization with phased-in competition and regulation - EU, Japan, Australia
Model 3 : Liberalization without privatization - India, Colombia
Model 4 : Private sector participation without privatization or liberalization - China, Saudi Arabia



As anticipated, Chile threw open its markets to international participation so as to induce fierce competition among local and global players in the telecom equipment and services sectors. The Chilean telecom market became a battleground for national service providers like CTC, Entel and Chilesat, and international giants like Telefónica de España (Spain) and Stet (Italy), as well as BellSouth, Motorola, Qualcomm and others from the U.S.[5] The most significant factor supporting this revolution was, once again, the dynamism shown by the telecom regulatory authority, SUBTEL: the associated policies evolved at the same rate as the emerging technologies, if not faster. Dynamic spectrum allocation and calling-party-pays billing were some of SUBTEL's most innovative moves, helping Chilean telecom keep pace with a fast-developing global industry.


The Impact
The natural result was an immense improvement in the quality of services across domains, including legacy wireline telephony, wireless mobile communications (including the personal communication services sector) and the Internet. This triggered a chain reaction of price reduction, market explosion, competition enhancement and technology development. The intensity of the impact can be gauged from this statement by David Allen (The WTO Telecommunications Agreements: Policy between Trade and Networks): “30 percent reduction in prices for international calls from Chile to the United States led to a 260 percent increase in traffic volume on that route”.
Chile's telecom revolution justified, in every sense of the word, the goals of the WTO's Basic Telecom Agreement. Today, “Chile's market for telecom equipment and services stands at nearly $6 billion, with annual growth over the next five years projected at 20-25% -- three times the global telecom growth rate. The Chilean phone network is already 100% digital, and carriers are tripping over themselves to install synchronous digital hierarchy (SDH) fiber optic networks from one end of Chile to the other”.[5] It has the highest rural teledensity in Latin America, truly accomplishing the ideal of universal quality service. The following graphs indicate some of the healthiest growth indices, across diverse telecommunication services, in the world.
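The Allen figures quoted above imply a striking price elasticity of demand on the Chile-US route. A back-of-envelope sketch (using the simple percentage-ratio definition of elasticity, my choice of formula, not Allen's):

```python
# Back-of-envelope elasticity implied by the quoted figures: a 30% price cut
# on Chile-US international calls met a 260% jump in traffic volume.

def simple_elasticity(pct_change_quantity: float, pct_change_price: float) -> float:
    """Percentage change in quantity divided by percentage change in price."""
    return pct_change_quantity / pct_change_price

e = simple_elasticity(2.60, -0.30)
print(round(e, 2))  # -8.67: demand on the route was extraordinarily elastic
```

An elasticity of roughly -8.7 (versus the -1 threshold for "elastic" demand) is exactly the kind of pent-up demand that makes liberalization self-reinforcing: price cuts grow total revenue rather than shrinking it.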

Graphs modeled based on information from http://www.ita.doc.gov/td/oepc/Telarg.htm

Most significantly, this has kept Chile's GDP in a healthy shape since the telecom industry contributes an average of 14% to Chile's GDP.[6]

In totality, Chile epitomizes an ideal telecom liberalization scenario. There are lessons to be learned in how a proactive policy, facilitated by international cooperation and backed by sharp business acumen, can really boost a nation's core infrastructure industry and hence the nation's economy in general. As Dean Alexander (director of Grant Thornton International's business center in Santiago) says, “The telecom sector in Chile will continue to expand for a number of reasons, including a lack of market entrance barriers; convenience for multinationals world-wide to set up their operations in Chile to manage the Latin American region; continuous upgrading of existing equipment and networks; establishment of strategic alliances between foreign investors and local partners, and the willingness of Chileans to adopt state-of-the-art technologies."



References:
[1] Telecommunication Regulation and Liberalization, http://www.american.edu/initeb/em0017a/Regulation.htm, as on 2/10/2007.
[2] WTO Basic Telecom Agreement, February 1997, Outline.
[3] WTO Basic Telecom Agreement, February 1997, Results.
[4] Stefaan G. Verhulst, Markle Foundation, Introduction to Telecom Reform and Liberalization, August 2003.
[5] Larry Luxner, Modernized Markets: Chile and Argentina take two different paths, Tele.Com.
[6] Aileen A. Pisciotta, Global Trends in Privatization and Liberalization (Ch. 23).

Sunday, February 4, 2007

Memo 3: COLD WAR REVISITED !

How often do we see two lions fighting over a potential prey while, in the act, the prey actually escapes! This could well be the case with the two consumer electronics giants Sony and Toshiba, in what seems a symbolic repetition of history. We are staring at a bleak possibility: a situation where you will not be able to borrow your favorite movie disc from a friend because it does not play on your latest hi-tech player! If you are still not in the zone, check this link: http://video.google.com/videoplay?docid=2724445957214251800.
We are taken back to the 1980s, when Sony was at loggerheads with the VHS camp over home entertainment video standards: Betamax (Sony) versus VHS (developed by JVC, with Toshiba among its backers). VHS won, in spite of an apparently 'inferior' technology. Today the cause (incompatibility of the two standards), the objective (consumer electronics market capitalization), the battleground (the home entertainment industry) and the leading players (Sony and Toshiba) are much the same; the only difference is that we are dealing with more sophisticated battle paraphernalia in the form of next-generation video disc standards, HD DVD (Toshiba) and Blu-ray (Sony). Speculation is that the outcome could be the same, with Toshiba's HD DVD format running away with the trophy.

Technically Speaking!
The Sony-Philips and Toshiba-NEC camps have both developed state-of-the-art video disc formats, much more sophisticated than the present DVD. Both use the same blue-violet laser wavelength (405 nm, compared to the DVD's 650 nm) for burning high-definition video content, but Blu-ray boasts the superior optics, with a tighter track pitch and a higher-numerical-aperture pickup (0.85, against HD DVD's 0.65), which focuses the laser to a finer spot.[1] For more on the technical details see http://www.engadget.com/2005/09/19/blu-ray-vs-hd-dvd-state-of-the-s-union-s-division/. These are not just a few technical specifications; in them lies the crux of the matter.
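Why wavelength and aperture are "the crux of the matter" follows from basic optics: the minimum focused spot diameter scales roughly as λ/(2·NA) (the Abbe diffraction limit), and areal data density scales roughly with 1/spot². A sketch using the commonly quoted format optics (405 nm at NA 0.85 for Blu-ray, 405 nm at NA 0.65 for HD DVD, 650 nm at NA 0.60 for DVD — figures taken from public format specifications, not from this post's source):

```python
# Diffraction-limited spot size for each optical disc format: a smaller spot
# lets the pickup resolve smaller pits, hence more data per layer.

def spot_nm(wavelength_nm: float, numerical_aperture: float) -> float:
    """Approximate minimum spot diameter via the Abbe limit, lambda/(2*NA)."""
    return wavelength_nm / (2 * numerical_aperture)

for name, wl, na in [("DVD", 650, 0.60), ("HD DVD", 405, 0.65), ("Blu-ray", 405, 0.85)]:
    print(f"{name:8s} spot ~ {spot_nm(wl, na):.0f} nm")
```

The shared 405 nm laser halves the DVD spot for both new formats, but Blu-ray's larger aperture shrinks it by a further quarter or so, which is where its extra per-layer capacity (and its costlier optics and tighter disc tolerances) comes from.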

Postmortem Report!
The problem lies in the incompatibility of players supporting the two disc formats, so a choice (with the obvious disadvantage of limitation) has to be made. Blu-ray's superiority lies in its more sophisticated data-burn and pickup-head technology: the disc has a theoretical capacity of 200 GB, against HD DVD's 60 GB.[2] This might not appeal to a movie buff who isn't interested in statistics and is happy fitting more than half a dozen high-definition movies on one disc (for which even 60 GB is more than enough). But today we have a whole new entertainment faction, the gaming gangs, for whom each GB of storage is worth an auction. So where does the problem lie? Don't we have a clear winner? Not really; it's all about money, honey! The mass-production cost of HD DVD discs and compatible players is much lower than Blu-ray's, for obvious technical reasons. Also, HD DVD rolled out to consumer markets a couple of months earlier than Blu-ray, gaining a bit of temporal advantage. So it's a replay of the Beta-VHS case, but this time the fight is much closer, and Toshiba might not get lucky as before. Both camps have deep corporate line-ups behind them. Among movie studios there are big names on each side (Blu-ray: 20th Century Fox, MGM Studios etc.; HD DVD: Paramount Pictures, New Line Cinema etc.) and some on both (Buena Vista Home Entertainment, Warner Bros. etc.). The contributor camp is similarly divided (Blu-ray: Adobe Systems, BenQ, Sony BMG etc.; HD DVD: the latest controversial inclusions of Microsoft and Intel, plus Acer Inc. and others).[3] For more, visit http://www.engadget.com/2005/09/19/blu-ray-vs-hd-dvd-state-of-the-s-union-s-division/. Blu-ray at the moment seems the stronger contender in terms of technology and corporate backing, but is up against HD DVD's price advantage.

Believe me or not!
At first it might seem that consumers will be the victims of this irrational hostility, but I believe all this build-up will be futile, as it is quite probable that both sides end up on the losing side. Technologically, HDTV streaming through upgraded cable networks and fast-evolving telecom networks (IPTV) is almost a reality, so do we really want to care about discs? Secondly, corporate loyalty is ephemeral and driven by speculation; if the alternate technologies reach the floor early, the movie studios will easily bid adieu to their 'loyalty' and adopt a neutral approach. Although it is believed that the PS3's backing will decide the winner (most probably in Blu-ray's favor), market capitalization might well stay restricted to the gaming sector, since the evident cost factor will not lure supporters to shift camps in pursuit of a potentially bigger market. And although HD DVDs (and their players) are cheaper than Blu-ray's, technology (which matters today as never before) is one factor no one wants to overlook. Given the degree of uncertainty and the fear of pseudo-competition, hardware manufacturers surely do not want to risk a mass-production investment, and consumers do not want to jump the gun with a hasty show of early partisanship that might deprive them of choice.
So while both camps want to heat up the battle and pamper their obstinacy (with an aggressive build-up) in an attempt to attain monopolistic stature, they are actually digging deeper holes for their own burial. The stalemate is giving other options a chance to render the standoff void. The market might actually never materialize, as there is massive psychological inertia against siding either way and the ideal option (buying, and manufacturing, a universal player) is far too expensive. As I said earlier, it is possible that the prey escapes both lions!

Is there a way out?
Just as beauty is nothing without appreciation, technology is nothing without consumerism. If this is understood, there is a possibility of truce. The two standards, in isolation, are never going to succeed in the long term. Just as a thaw was facilitated to introduce a common DVD standard, a way has to be sought to end this. Before it's too late, they should work at making technology the actual winner, and not turn this standoff into a mere classical conflict-analysis case study for future academia.

Credits: All facts, information and references [1], [2] and [3] obtained from http://reviews.cnet.com/4520-6463_7-6462511-5.html?tag=nav

Monday, January 29, 2007

Article 1: Internet's Openness Threatened!

Today, the network neutrality debate is a raging issue in the US Senate and in telecom circles. Although it is under scrutiny in most major international telecom markets (Europe, India, Japan etc.), all eyes are on the US, as it could well set the tone on a global pedestal.
The gravity of the issue can be comprehended if one makes a comparative study with the scenario in China, where the term 'network neutrality' already seems extinct.
The implications are not only of commercial and legal significance, but are also symbolic of the national ideals of some of the biggest international players (US, Europe, Japan, India, China).

To know what is meant by network neutrality, see this short clip
http://www.savetheinternet.com/

Learn more about the policy contrasts on following links:

Call for an Open Internet!
http://www.commoncause.org/site/pp.asp?c=dkLNK1MQIwG&b=1234951

'The Great Firewall of China'
http://en.wikipedia.org/wiki/Internet_censorship_in_mainland_China

Memo 2: Know the CTO!


Commonwealth Telecommunications Organization (CTO)

The Commonwealth Telecommunications Organization (CTO) is one of the fastest-evolving international telecommunication organizations in the world today. It aims to coordinate the efforts of the government and corporate telecommunication departments of Commonwealth and non-Commonwealth nations, in concert with their respective business and regulatory institutions, across a multidisciplinary sphere of technology, commerce, policy and socio-economics.



History and Background
“CTO’s presence roots back to 1901, as the Pacific Cable Board”[1]. It was a British undertaking whose work basically involved the setup and maintenance of electrical telegraph infrastructure to facilitate limited administrative correspondence. After the independence of several British colonies and the formation of the Commonwealth, “the organization was restructured, in 1967, as an international treaty organization, independent of the Commonwealth Secretariat and with diplomatic status in its host country, the UK”[2]. From then on it was known as the Commonwealth Telecommunications Organization (CTO); it underwent several structural, organizational and operational changes and evolved into one of the major international institutions in the telecommunications domain. Thirty-five member nations (e.g. the UK, South Africa, India, Canada and Australia) formed the governing council of the organization, as these were the key financiers of its existence and eventual growth[3]. Col. D McMilan (UK) and Mr. S.N. Kalra (India) were, respectively, the first Chairman and Vice-Chairman of the restructured organization in 1967. Over time, the CTO has grown in its role as a telecommunication technology, business and regulatory entity.

Mission[4]
“CTO’s mission is to:
Offer the highest quality programs for capacity development, knowledge sharing and information services to member countries;
Deepen, expand and diversify the partnerships between governments, businesses and other organizations to reduce global poverty and achieve the Millennium Development Goals for ICT;
Help bridge the digital and knowledge divide especially in the five key sectors of food & agriculture (e-nutrition), education (distance learning), health (telemedicine), e-government and e-commerce;
Facilitate the successful development of telecommunications and other businesses to support social and economic development objectives of governments and civil society. “[4]



The Present Structure
The CTO has a well-defined operational and organizational framework. The organizational framework can be classified into legislative and executive branches: the former forms the Commonwealth Telecommunications Council, and the latter, in charge of the organization's day-to-day functioning, forms what is known as the CTO Headquarters.



Schematic conceptualization of CTO’s organizational structure based on the information from http://www.cto.int/



Activities
The primary thrust of CTO’s efforts has been the development of telecommunications technology with the aim of creating digital parity among the developing Commonwealth nations. ‘Information and Communication Technologies (ICT)’ are the CTO’s primary focus today[5]. It coordinates R&D and training programs, in technical, business and regulatory capacities and at various stages of conception, with several international telecommunication institutions such as the ITU, DFID, USAID and TRASA, as well as with global and local companies such as BT, Cable and Wireless, and Telkom South Africa[6]. “Since 1983, it has been aggressively involved in over 3000 bilateral technical projects, 120 international conferences and numerous scholarship schemes, information resources and other activities of value to the governments and telecommunications businesses in Commonwealth countries”[7].
Through its diverse activities, it has significantly shaped the socio-economic landscape of several Afro-Asian societies and has thereby contributed a great deal to the rapid development of the global telecommunications machinery.


Credits:
All information, including the logo obtained from http://www.cto.int/
References
[1], [2], [3], [7]: http://www.cto.int/index.php?dir=03&sd=50, as of 27 January 2007
[4]: http://www.cto.int/index.php?dir=03&sd=30, as of 27 January 2007
[5], [6]: http://www.cto.int/index.php?dir=08&sd=00, as of 27 January 2007