Monday, August 6, 2007
Special Report--Bhabha Atomic Research Centre
Thorium is the word
Often criticized for being covered in a shroud of security and secrecy, Bhabha Atomic Research Centre has been the incubator of many strategically important technologies. Today Anil Kakodkar is focusing BARC on developing thorium technology for power generation
Shivanand Kanavi
When you are finally able to get through the security-shielded gates of BARC - understandable after Pokhran-II - and enter its labs ensconced in the verdant surroundings of Trombay, you are likely to hear only two words - Thorium and mum. Ask director Anil Kakodkar or any senior scientist about the future of BARC and they will say, "Thorium". But if you want to know anything about Pokhran-II, thermonuclear bombs, nuclear submarines, and so on, be warned - mum's the word.
A band of highly motivated scientists and engineers went about building Apsara, the first nuclear reactor in Asia, more than 45 years ago at the Atomic Energy Establishment, Trombay. Today they have converted that marshland into a veritable storehouse of science and technology. "As far as scientific and engineering expertise goes, I think we have a goldmine here," says Anil Kakodkar. A 56-year-old nuclear engineer, Kakodkar turned down several offers from the private sector after graduating from VJTI and joined the atomic energy establishment against the wishes of his friends and relatives, but he has not looked back since.
"Every day here, is a challenge for an engineer or a scientist. We have vastly grown since those early days and today have over 4,500 scientists and engineers on this campus and about 10,000 technicians and support staff. Naturally the informality that existed then is difficult to maintain, but as for scientific dissent, we thrive on it. In fact, if we had all the division heads here for this discussion on BARC we may not have a very peaceful session!" he says.
Is the money spent on BARC commensurate with its output? One observer who preferred to remain anonymous asked: "Half of India's R&D budget has gone into atomic energy. Is that justified?" This question is raised both by the lay public and in scientific circles, though it has been muted since the five nuclear explosions at Pokhran in May 1998. For most people BARC has always meant the bomb, but why are scientific circles envious of BARC? The answer, naturally, is money. When research funding in India is meagre, the fight for a share of the pie becomes intense.
Indian universities are starved of research funds; even the leading ones have hardly any. Many top-ranking universities have cut the number of research journals they subscribe to for want of money. As for modernising labs and other infrastructure, the less said the better. Increasingly, they are being asked to raise funds from non-governmental sources, primarily industry and alumni. Even the blue-eyed boys of higher education and research, the IITs, found it traumatic when they were told that government funds were no longer available.
However, the IITs have been lucky. Criticised for their flight to North America, it is these very alumni who are now coming to their aid. The IITs are increasingly tapping them for funds and are having some success. Kanwal Rekhi, a successful entrepreneur in Silicon Valley, is in fact confident that nearly $500 million can be raised from IIT alumni worldwide if the idea is marketed properly. The University Department of Chemical Technology (UDCT) in Mumbai has pioneered non-governmental fundraising through industrial consultancy and donations from alumni. Of course, it helps that UDCT's alumni practically run India's chemical industry. But one swallow does not a summer make.
The 44 national laboratories of CSIR, which form the largest chain of publicly funded R&D labs in the world, have an annual budget only twice that of BARC! However, CSIR saw the writing on the wall in the late 1980s and, in the past five years, under R.A. Mashelkar's leadership, has implemented a vigorous programme of innovation and technology marketing worldwide. This is making a positive impact on R&D in India in terms of a trend, though the large numbers are yet to come. In such a situation, would you grudge a scientist outside atomic energy and space a little heartburn?
So what has been the outcome of R&D at BARC? Strategically it is clear that the single most important contribution to India's nuclearisation programme has come from BARC. Its founders were way ahead of their times and invested in a small way in many technologies that have proved strategically indispensable. For example, reprocessing spent fuel from power reactors and some research reactors leads to recovery of fissile material like plutonium, which can be used to either build fission bombs or fuel other power reactors. It is obviously a dual-use technology.
Research into reprocessing started way back in 1963-64, much before any fuel needed reprocessing. In fact, India is one of the very few countries today which possess this complex chemical technology. This work has two aspects. One, obviously, was Pokhran-I in 1974. But it also led to reprocessing at an industrial level, so that today reprocessing plants at Trombay, Tarapur, and Kalpakkam operate with BARC technology. The simultaneous work on fast breeder reactors with French help has built up important cumulative experience in this technology, so that a new Indira Gandhi Centre for Atomic Research has been set up at Kalpakkam near Chennai to develop it further towards building a 500 MW prototype fast breeder reactor. Such a reactor will produce more fissile material than it consumes, roughly in a ratio of 1:1.2, and hence the name 'breeder'.
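As a rough illustration of what a breeding ratio of about 1.2 means (a minimal sketch in Python with assumed numbers; it ignores reprocessing losses, decay and out-of-core inventory), for every kilogram of fissile material burned, roughly 1.2 kilograms of new fissile material are bred from the fertile blanket, so the inventory slowly grows:

    # Illustrative only: assumed starting inventory and burn-up per cycle
    def fissile_inventory(start_kg, burned_per_cycle_kg, breeding_ratio, cycles):
        inventory = start_kg
        for _ in range(cycles):
            # net gain per cycle = material bred minus material burned
            inventory += burned_per_cycle_kg * (breeding_ratio - 1.0)
        return inventory

    # e.g. 100 kg of fissile material, burning 10 kg per cycle at a ratio of 1.2,
    # gains about 2 kg per cycle: roughly 110 kg after five cycles
    print(fissile_inventory(100.0, 10.0, 1.2, 5))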
Similarly, the unit in BARC which assembled the control instruments for Apsara and later CIRUS (Canada-India Research Reactor) was spun off as the Electronics Corporation of India Limited (ECIL), which produces control instrumentation not only for all the reactors but also for several defence projects. The work done on heavy water production using a new hydrogen sulphide process has been industrialised under the Heavy Water Board, which runs several plants that produce heavy water for the power reactors.
A small group at BARC known as the Atomic Fuel Division took on the challenge of fabricating fuel for the Apsara, CIRUS, and Tarapur plants despite the fact that the original equipment suppliers from the UK, Canada, and the US were ready to supply it themselves. This work led to the large industrial Nuclear Fuel Complex at Hyderabad that fabricates all the fuel elements required for the power reactors. The isotope division, which used research reactors like CIRUS and, later, Dhruva to produce over a hundred radioactive isotopes for medical and other applications, has spun off another industrial unit, BRIT (Board for Radiation and Isotope Technology). Today over 150 hospitals practice nuclear medicine and about 500 laboratories use radio-immuno-assay techniques. Nearly a million patients a year in India are investigated using radio isotope techniques developed at BARC, with isotopes made industrially by BARC. Moreover, 350 commercial organisations have sprung up to service the need for isotope radiography used in non-destructive testing of industrial plants.
The nuclear power plants themselves require a lot of R&D work for operation and maintenance. Some of it involves advanced reverse engineering in adverse circumstances, when you do not even possess the drawings of spares, as in the case of the Tarapur Atomic Power Station; in other cases it involves innovative work to keep the power running. For example, Kakodkar is proud of the work done to get the reactors near Chennai back on line when they were about to be shut down for good because of a moderator manifold collapse in the heart of the reactor. Similarly, when the coolant channels in the two Rajasthan reactors had to be replaced, BARC developed the expertise and the robotics required - a highly complex engineering challenge. Till then only Canada had the technology, but the BARC-NPC technology was much cheaper and accomplished the task in less time and well within the radiation exposure limits.
"If you want to discuss the commercial or industrial applications of R&D carried out at BARC then these are some major ones. Our mandate has been to develop the technology of applying nuclear energy for power and other purposes. It is clear that BARC has been the mother institution for the entire nuclear industry in India," says Kakodkar. "The monies earned through transferring some of the spinoff technologies like the enzyme-based process to manufacture invert sugar, used heavily in biscuits and the confectionery industry, or particle size analysers are used in the pharmaceutical industry, etc, are incidental. Unlike CSIR laboratories, which were set up to develop processes and technologies for existing industry and earn money through royalties and licensing, were set up to give birth to many industries which did not even exist," says A.K. Anand, director of the Reactor Projects Group, who is in charge of technology transfer and international relations.
"For example, the expertise in robotics developed at BARC under M.S. Ramkumar's leadership is of a very high standard. We had to develop it to build an online fuelling machine for the power reactors and then a coolant channel inspection system for the same.Just then Indian Oil Corporation, which owns over 6,000 km of overland cross-country pipelines, tapped this expertise. Thus came into being the Instrumented Pipeline Inspections Gauge (IPIG), which will soon be tested on the Patna-Barauni pipeline. Since IOC pays over a Rs11akh per kilometre for such an inspection to foreign companies who hold proprietary technologies, the development of IPIG is very welcome," says Kakodkar. "Similarly, several groups started working on parallel processing in the early 1990s in India when the Cray XMP computer was denied to us and even the purchase of the Cray XMP by the meteorological department had several humiliating conditions attached. Our supercomputer group produced Anupam, which has reached 1.3 gigaflop speed and is the only Indian supercomputer that is successfully running the weather modelling programme," he adds.
Being such a hi-tech centre, isn't BARC concerned about intellectual property rights (IPR)? "We are slowly becoming aware of IPR in the case of non-nuclear technologies and are preparing to protect some of our innovations. As far as strategic technologies are concerned the issue does not arise. If somebody is ready to license these technologies for a fee there is no problem. We can then say there is a free market for technology. But if such technologies are brought under sanctions and embargoes, where is the issue of patents?" asks Kakodkar, warming up to the subject of IPR.
"In the 1950sand 1960s, when there was hardly any high-technology infrastructure in the country. But now that there are the IITS, the CSIR labs, etc, is it necessary to do everything under one roof and that too such diverse things as biotechnology, lasers, and parallel processing?" ask some critics. Kakodkar says things are changing. He believes in networking and that is why there is an increasing emphasis on partnering with universities, the IITS, and the CSIR labs. At the same time he claims that only four engineers worked on Anupam and that, some of the biotechnology was a by-product of work being done with nuclear applications in mind. "We have a simple guideline to approve projects - they have to be relevant or excellent. That gives a general focus to the work. The specific focus, of course, is thorium, while work on isotope technology will continue," he says.
What is thorium technology and what is its relevance to India? "India has a limited supply of uranium (estimated reserves are only 78,000 tonnes) as against 518,000 tonnes of thorium. Therefore, to achieve long-term energy security, it is imperative to develop technology for large scale electricity generation using thorium," says R.K. Sinha, head of the reactor engineering group. (see box)
While the reactor engineering group is busy setting up critical facilities for advanced heavy water reactors that will use large amounts of thorium as fuel, K. Balu and his group at the nuclear recycle group are already studying reprocessing and waste treatment for thorium. Balu's group is credited with having developed the vitrification technology that will immobilise highly radioactive nuclear waste in a glasslike structure so that the waste will not leach out. This glass will be further covered by two stainless steel jackets and then lowered into a thick concrete vault built into basaltic rock. The site is chosen so that there is very little seismicity in the area and no fissures that carry groundwater. "The technology is there, though it will be used several decades later," says Balu.
Thus the whole cycle of making fuel bundles, designing reactors that will burn thorium, reprocessing the spent fuel, and disposing of waste is being worked on today with thorium at the centre. "Just as the technology that we are now using in power reactors was developed about 30 years ago, we need to start developing technologies that will be used 30 years hence," says Kakodkar.
Pokhran fallout
"A major management technique that Kakodkar is associated with at BARC is the formation of multi-disciplinary task forces," says A.P. Jayaraman, a senior scientist who now heads the public awareness division. "We have at least 20 major task forces operating today," says Kakodkar. "The composition is purely need-based and not hierarchical. These groups also work in a very transparent way - nobody can hide behind technical jargon to explain why he did not fulfill his task"
Like all hi-tech organisations in the country, BARC is losing up to 30 per cent of its young scientists working in computers and electronics within five years of their joining. But Kakodkar points out that this is happening in the IT industry itself. Pokhran-II, of course, has helped attract young people to BARC. Recently, when he went to deliver the convocation address at IIT Madras, the generally self-effacing Kakodkar was faced with hordes of IIT graduates asking for his autograph. "For the first time in my life," he says.
Did Pokhran-II create butterflies in his belly? "Surprisingly, no. In fact the only time I remember spending a sleepless night was when, as a young engineer at BARC 30 years ago, I went ahead with designing and putting together a high-pressure, high-temperature loop by beg, borrow, or steal. The day before it was to be tested I could not sleep, as I had not listened to the traditional wisdom of some of my senior colleagues and had done what I thought was right. But the next day it worked." Kakodkar still carries the courage of his convictions, but has grown wise enough to carry his junior and senior colleagues with him. At BARC undoubtedly there is a spring in everybody's step. Pokhran has contributed to it in no small measure, and Kakodkar's leadership no less.
BARC budget (Rs crore)

                                     1996-97   1997-98   1998-99   1999-2000
Revenue (salaries, consumables)          220       290       330         340
Capital expenditure (new assets)          42        57       103         200
Unlocking thorium secrets
India would be a leader in nuclear technology if it develops the thorium cycle for power.
Thorium has several advantages over uranium.
• Worldwide, thorium deposits are three times those of uranium. In India's case they are nearly seven times as large.
• Thorium is a more fertile material than natural uranium, i.e. there will be a larger percentage of thorium 232 converted to fissile uranium 233 than uranium 238 converted to plutonium 239 in the existing pressurised heavy water reactors.
• Thorium is a better conductor of heat and that makes the fuel bundles last longer in the reactor without significant deterioration.
• Long-lived radioactive by-products (actinides) which create waste disposal problems are produced in much less quantity in the thorium fuel cycle than with uranium.
BARC today is concentrating on all elements of a thorium fuel cycle, from fuel fabrication to reprocessing and the extraction of uranium 233 - while avoiding the complications posed by the highly radioactive U232 - and then disposing of the waste produced during the thorium cycle. For experimental purposes thorium is already being loaded into existing power reactors. This has not only helped in power-flattening in the core of the reactor but has also provided some quantity of U233. Exposing thorium to neutrons in the Dhruva research reactor has also generated small quantities of U233. A new research reactor, Kamini, has been built using U233 and thorium.
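For reference, the fertile-to-fissile conversions behind the thorium and uranium routes mentioned above run roughly as follows (a simplified sketch; half-lives are approximate):

    Th-232 + neutron -> Th-233 --(beta decay, ~22 min)--> Pa-233 --(beta decay, ~27 days)--> U-233 (fissile)
    U-238  + neutron -> U-239  --(beta decay, ~24 min)--> Np-239 --(beta decay, ~2.4 days)--> Pu-239 (fissile)

The relatively long-lived protactinium-233 intermediate is one reason why reprocessing chemistry for the thorium cycle differs from the familiar uranium-plutonium route.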
The whole three-stage nuclear programme might take considerable time for both technological and financial reasons. For example, according to the original plan, 10,000 MW was supposed to be produced by 2000. However, after the Nuclear Power Corporation placed orders with equipment suppliers for advanced procurement and so on, funding was withdrawn by the Central government. That left both NPC and Indian industry involved in hi-tech nuclear fabrication high and dry.
Business India is witness to the fact that two 500MW reactors, which will now be erected in Tarapur around 2004, were already fabricated and lying ready in 1993 at BHEL, Walchandnagar, and L&T! Thus the biggest brakes on India’s nuclear power programme have been the planners in Delhi rather than the Department of Atomic Energy.
In such a situation Kakodkar and his team at BARC have come up with an innovative intermediate solution called the advanced heavy water reactor (AHWR). This technology is highly competitive compared to the existing technologies in several ways:
• Instead of heavy water, ordinary water is used as a coolant.
• The complexity of steam generation is greatly reduced, thereby reducing delivery time.
• Natural convection used in safety systems reduces the capital costs considerably.
• It is thermally more efficient due to the use of moderator heat in preheating feed water.
• Coolant channels can be constructed on an assembly line, thereby reducing construction cost and time.
• It is safer than existing reactor technologies.
The additional advantage of AHWRs will be the use of a large amount of thorium in the fuel. However, since nobody in the world yet possesses thorium technology, BARC's efforts today will start having a positive economic effect only around 2020. Considering that the technologies being used industrially today for power production were actually worked on 30 years back, that is really investing in the future.
Radiation with a heart
The word 'radiation' conjures up images of the deformed bodies after Hiroshima. However, radiation can be lifesaving as well. Besides the well-known gamma irradiation of tumours using Cobalt-60 units that are supplied all over India by BARC, the centre's scientists have also come to the rescue of cardiac surgeons. One of the techniques used to save cardiovascular patients suffering from choked arteries is angioplasty.
Simply put, the surgeon sends a tiny balloon into a choked artery and inflates it at the right place. The additional internal pressure thus expands the blood vessel, facilitating the flow. To make sure that the arterial walls do not collapse, surgeons insert tiny metallic coils called stents within the blood vessel. However, these stents can cause tiny injuries to the walls and when these injuries heal the scar tissue can choke up the vessel again.
BARC scientists led by S.M. Rao at the isotope division came up with a solution to this potentially fatal problem. They coated these stents with tiny amounts of radioactive phosphorus so that the wounds caused by the stents are cauterised in a short time, preventing scar formation and saving the patient's life. Already 30-40 such implants have been carried out by surgeons in Mumbai with a very good success rate. Multicentric trials of this technique are currently under way and, if successful, will give patients undergoing angioplasty a new lease of life.
Nuclear medu wada
Hardly anybody outside BARC, its nuclear agriculture division, and a few agricultural universities might know that 95 per cent of the urad dal (black gram) grown in Maharashtra is a BARC product. Urad, a pulse whose flour is the main ingredient of medu wada, a popular south Indian snack, is produced with varieties developed by genetically altering conventional breeds through irradiation. The variety TAU-1 has increased the yield per hectare by 29 per cent.
Similarly, a popular mustard variety grown in Assam is another BARC product. The widely exported large-sized groundnut is yet another BARC product, the Trombay Groundnut (TG-1). Today more and more varieties of groundnuts, soya beans, moong dal, and tur dal are produced with higher yields, pest resistance, and other desirable qualities.
Plant breeders and farmers depend on the genetic variability available in nature for cross-breeding and developing new breeds. That variability is the result of spontaneous mutations. However, mutations can also be induced artificially to enhance variability manifold. One of the most efficient methods of changing plant genes (mutagenesis) is exposing seeds to neutrons or gamma radiation. The irradiated seeds are studied in depth to understand the effects brought about by irradiation, and it has been found that the best results are obtained when these modified seeds are further used in cross-breeding. BARC has been working in this field since the early 1970s and the result is a little-known but significant contribution to increased food production.
Indian Nuclear Industry
The Nuclear Fallout
With the nuclear power programme facing a serious resource crunch, industries will have to explore new options for using their nuclear-related skills.
Shivanand Kanavi
When we talk of nuclear power we talk about its economic viability, environmental hazards, fears of radiation leakage, waste disposal, or even the problems of closing down a reactor after its useful life. But the other spin-offs to our economy - in terms of scientific-technical manpower, engineering skills and capacities, not to mention bottom lines and business turnovers - have not been studied in any detail.
These spin-offs have been varied. Since the 1960s, when India started generating electricity using nuclear power, a host of industries have sprung up in heavy engineering, fabrication, and construction. All of them owe their development of skills, quality consciousness, and confidence to tackle bigger and bigger problems (in size as well as in technological level) to their participation in the indigenous nuclear power programme.
Anyone who does not know the abysmal condition of our laboratories and universities in the 1940s, and even of our engineering industry in the 1960s and early 1970s, cannot easily appreciate the spin-offs that have occurred due to the nuclear programme. M.S. Krishnamurthy, joint general manager of the engineering giant Larsen & Toubro, who has been associated with the nuclear programme for over 25 years, says, "Without the push given by the nuclear power programme we would not be able to do what we are capable of doing today. In the pre-nuclear era, we used to make some equipment for dairies and small cement plants that weighed a couple of tonnes. Today, we have moved into the third generation of heavier precision engineering at Hazira, where we can fabricate components weighing up to 450 tonnes."
This technological advantage works out in other areas as well. For P.J. Bhounsule, sales development manager, L&T (an IIT graduate who has worked on nuclear projects for nearly two decades), the engineering challenges encountered while catering to their nuclear commitments were of the toughest variety. "One of the toughest assignments we faced was the welding of the two halves of the half-metre-thick steel disc that formed the deck plate of the Dhruva reactor," says Bhounsule. "The weld had to be so perfect that even tiny atoms of helium couldn't leak through. Simple heating of the two lips of the joint led to unequal expansion along the diameter and circumference of the half discs, leading to gaps between the lips of the joint. We had not calculated the different heat sink characteristics. This led us to use computer simulation for the first time."
An analysis of the results revealed that the problem could be solved if the discs were thermally insulated and heat was provided at twenty-five distributed points all over. "Finally, we machined channels into the lips so that they could lock into each other, and after careful deep welding from both sides of the disc we got the defect-free weld," says Bhounsule proudly.
This precision and problem-solving capacity is what all the industries associated with nuclear technology praise. T.S. Sakethan, general manager, special products division, Walchandnagar Industries (WIL), proudly shows off his hi-tech dust-free shop floor, ingeniously assembled right in the midst of the cranes and forklifts. He points out a welder meticulously welding the tubes to a tube sheet in a heavy water heat exchanger. "The welds have to be totally defect-free," he says. "Normal methods of non-destructive testing (NDT) like sonography, radiography, dye penetration, and magnetic particle patterns cannot be used here, so we do statistical quality analysis. The welder has to be trained in the technique for months together and pass all sorts of tests."
But even this is not enough. The welder's skill is constantly checked, since there is little or no room for error. "Every day before he starts work, he has to weld a few samples, which are then physically sawed off and tested for defects," says Sakethan. "Only when the samples show zero defects is he allowed to touch the job that day." This may sound unnecessarily time-consuming, but with the risk of nuclear leaks taking precedence over all else, it is a necessary precaution.
One corollary to this kind of nit-picking precision is that customers of nuclear manufacturers can be sure they will get quality of the best kind. P.J. Bhounsule of L&T says, "The philosophy of quality control had to be changed from post-manufacture checks to planned quality assurance, systematic definition of manufacturing procedures, and documentation. All these have helped us obtain authorisation to use various quality stamps of the American Society of Mechanical Engineers and the ISO 9001 certification."
M.L. Mitra, director, environment and public awareness, Nuclear Power Corporation, who was deeply involved in the handholding operations in the early years, recalls, "We had to convince many in the industry that quality does not mean higher cost but lower project cost."
As confidence in their technical abilities and quality grew, the industries were able to take on more challenging tasks. Currently, nuclear manufacture involves the standardised design of the 235 MW reactor, the consolidation of infrastructure, manufacture using the convoy system to cut project time, and the design and manufacture of 500 MW reactors for Tarapur III and IV and Rajasthan III and IV. The industries have also built components for the heavy water projects and the Fast Breeder Test Reactor. Now the pool-type Prototype Fast Breeder Reactor, which will generate 500 MW using liquid sodium, has been designed, and the industry will participate in its fabrication as well.
But perhaps the best spin-offs to these nuclear-affiliated industries have been in terms of turnover. L&T alone has done Rs 312 crore of nuclear work. Bharat Heavy Electricals, which has gained the most, has made over Rs 800 crore. Most of this business is almost pure profit, since the industry only has to pay for labour costs; the raw materials are provided by the DAE and the NPC.
Besides its contribution to corporate bottom-lines, what have been the spin-offs in terms of new business? "With our expertise, if not on a turnkey basis, at least as critical component manufacturers, we can get contracts from multinationals who want to set up industries in India," says T.V. Rudrappa, general manager, quality assurance, WIL.
R.D. Hariani, technical director, GR Engineering, concurs: "Association with the Nuclear Power Corporation has helped us indirectly in getting jobs in other sectors, as quality has been upgraded in an overall sense." Krishan Kumar, general manager of the public sector giant Bharat Heavy Electricals, is equally upbeat regarding spin-offs: "BHEL has gained considerably in technology through its association with nuclear power. Now we are in a position to execute the conventional side of a nuclear power plant on a turnkey basis." After the recent fire in the generator at Narora I, the turbine generator that was based on a GE design is also being redesigned for Indian conditions by BHEL and NPC.
With these design modifications, Indian nuclear-related industries have finally come into their own. They have moved from total dependence on foreign designs, to making design changes, to finally conceptualising and manufacturing their own designs. K.R. Balakrishnan, general manager, control panels, GEC Alsthom India Ltd, which has supplied over Rs 15 crore worth of control, protection and switchgear equipment to all the reactors, says unequivocally that association with NPC projects has helped them acquire experience in designing and manufacturing equipment suitable for an earthquake-prone environment.
K.K. Sinha, chairman and managing director, Mishra Dhatu Nigam (Midhani), a PSU set up to develop superalloys, is proud that hundreds of tonnes of a very special steel called grade 403 (a medium-carbon steel whose composition is controlled within a very narrow range) were produced by Midhani. Similarly, another special steel containing copper and niobium, the 17-4 PH grade, was also developed and produced by Midhani for nuclear reactor components using electro-slag refining and vacuum arc furnaces. Not many countries in the world have these capabilities, says Sinha proudly.
Where to from here? With the resource crunch threatening India's own nuclear programme, the logical next step would have been to export the technology. But the government has given very little thought to going into the global nuclear business, although Japan and South Korea are feverishly building nuclear power stations. Besides, there may be a number of developing countries that would go in for the smaller 235 MW PHWR if the fuel supply could be arranged. Indian expertise in building research reactors has been sought worldwide, but India has not pursued it.
The real test of our nuclear industry will come in delivering systems and components on schedule for international clients. And in the ultimate analysis, the industry will be able to use the skills it has acquired in other fields. For although the nuclear industry is facing a serious resource crunch, the resourceful among them will turn this adversity into opportunity.
Thursday, August 2, 2007
Light Emitting Diodes
Business India, December 19, 2005-January 1, 2006
Chips of light
LEDs convert electricity into light much more efficiently than, say, incandescent bulbs or fluorescent lamps
Shivanand Kanavi
Can semiconductor chips, which have revolutionised the way we live, give us light? Yes, they can. Such chips for lighting are not made of silicon, which is used in electronics, but of more complex semiconductors made of alloys of gallium, indium, arsenic, nitrogen, aluminium, phosphorus, etc. They are fast becoming the coolest new technology in lighting.
It has been known since the early twentieth century that some semiconductors emit light when a current is passed through them. However, it has taken almost a hundred years for the technology to do this efficiently and inexpensively. Most of these are what are called direct band gap semiconductors, and they have also led to the development of semiconductor lasers. Inexpensive semiconductor lasers drive your CD player, DVD player and the laser pointer used during presentations, while a simpler infrared LED sits inside your TV's remote control.
Semiconductor lasers are also extensively used in high-speed data communication, from the run-of-the-mill office computer networks called LANs (Local Area Networks) to mighty submarine fibre optic cable networks, like the ones acquired recently by VSNL (Tyco) and Reliance (Flag).
The discovery and perfection of the direct conversion of electricity into light has also led to the reverse: the development of more efficient solar panels to convert light into electricity.
The diodes, which emit light when they are conducting an electrical current, are called Light Emitting Diodes or LEDs. They are already becoming quite popular as Diwali or Christmas lights and in traffic signals. Those green and red light dots that indicate whether the device is active or in sleep mode in your digital camera, camcorder, DVD player and TV are also LEDs.
Compound semiconductors are considered the country cousins of the more flamboyant silicon chips that power our computers, cell phones and all electronics. However, without much ado, their optical applications are multiplying in everyday life.
The first bright LEDs to be invented emitted red light, and later orange and yellow. However, attempts at producing green and blue LEDs were not very successful till a Japanese scientist, Shuji Nakamura, invented a bright blue LED, and later a white LED, in the mid-1990s. Nakamura's work brightened up the whole field and intense activity ensued, leading to fast growth. He had worked hard for several years, with very little funding and repeated disillusionment, to come up with the blue LED. The company he worked for at the time, Nichia, is today one of the world leaders in blue and white LEDs and lasers. A few years ago he moved out of Nichia and is currently a faculty member at the University of California at Santa Barbara. While Nakamura works on the optical properties of gallium nitride and other compound semiconductors, his colleague Umesh Mishra researches the electronic properties of gallium nitride to produce high-powered transistors for cell phone companies and the US Defence Department. If successful, Mishra's gallium nitride transistors will replace vacuum tubes in their last refuge - high-power microwave systems in radar and communication networks. Together Nakamura and Mishra have built up a formidable team of cutting-edge researchers in gallium nitride at Santa Barbara.
Yes, all you Baywatch junkies, they also do serious science off the sands of Santa Barbara.
On a more serious note, the technology is evolving rapidly and in the next five years might revolutionise lighting. LEDs for lighting purposes have many advantages. They convert electricity into light much more efficiently than, say, incandescent bulbs or fluorescent lamps; in fact, 90 per cent of the energy fed into an incandescent bulb is wasted as heat. LEDs also last much longer - up to 100,000 hours, or over 11 years of continuous operation! For incandescent lamps the life is of the order of 1,000 hours, and for fluorescent lamps of the order of 10,000 hours. LEDs also consume much less electricity, which is why the batteries in an LED flashlight, for example, seem to go on forever. That is ideal if you are on your own in a remote area, whether camping, trekking, or caught in a natural disaster. For example, Pervaiz Lodhie, a Pakistani entrepreneur in Southern California, dispatched over 2,000 solar-powered LED flashlights to Kashmir soon after the earthquake hit the inaccessible Himalayan region. Last year his firm had also sent such items to South East Asia after the killer tsunami hit the area.
What are the weaknesses of this promising lighting technology in an increasingly energy-starved world? Primarily three. One, the brightness, measured in lumens per watt of electrical power, is still nowhere near the standard required for high-brightness lighting. Two, the products are still expensive; a decent flashlight, for example, costs around $15-40. Three, the light is extremely bright in one direction, so an LED lamp directed at your workbench or a flashlight works well, but if you try to light up a whole room with it you end up using too many LEDs.
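To get a feel for what lumens per watt implies for running costs, here is a minimal back-of-the-envelope sketch in Python; the efficacy figures are rough assumptions for that period, not measured values:

    # Rough, assumed luminous efficacies in lumens per watt (illustrative only)
    efficacy = {
        "incandescent bulb": 15.0,
        "fluorescent lamp": 60.0,
        "white LED (mid-2000s)": 40.0,
    }
    target_lumens = 800.0  # roughly the light output of a 60 W incandescent bulb

    for lamp, lumens_per_watt in efficacy.items():
        watts_needed = target_lumens / lumens_per_watt
        print(f"{lamp}: about {watts_needed:.0f} W for {target_lumens:.0f} lumens")

On these assumed numbers the LED needs about 20 W against more than 50 W for the incandescent bulb for the same light output, which is why battery life in LED torches improves so dramatically.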
The US Department of Defence and the Department of Energy are heavily funding research into semiconductors to come up with high power lighting and electronics. As a result the developments are feverish in this field.
Recently the venerable General Electric, a company founded by Thomas Edison to sell the light bulbs he invented, announced Organic Light Emitting Diodes. In layman's terms, soon there will be inexpensive plastic sheets that light up panels and curved surfaces. Cree Research Inc., a Nasdaq-listed leader in LED chips, has produced very bright LEDs (more than 90 lumens per watt) two years ahead of the industry's expectations.
“A much less fashionable but critical area to work in is the encapsulation of LEDs,” says Rajan Pillai of Nanocrystal Lighting Corporation, a research-based start-up from New York. He is referring to the fact that the semiconductor chip is surrounded by a transparent lens capsule which acts as a protective cover as well as an outlet for light. An LED emits light of only one colour. In order to generate white light, one introduces substances called phosphors into this casing. These phosphors absorb the original light from the LED and emit light of different colours; an appropriate phosphor would thus create green light, or white light, from a blue LED. So if the phosphor can be improved, the brightness of the LED can be improved. Pillai claims that the new phosphors invented at Nanocrystal Lighting Corporation are smaller than the wavelength of light, and hence invisible, and that they can increase the brightness by about 20 per cent.
You know a technology has moved out of the lab and the VC firms and into the marketplace when you find its products on the Christmas shopping lists of visitors to Walmart and other retail chains. That is what LED flashlights have achieved this holiday season, just as digital cameras and iPods did earlier.
Perception and Technology
Psy-Tech
Can the soft sciences combine with hard technology to produce winners?
Shivanand Kanavi
The word ‘technology’ immediately conjures up in our mind machines, number crunching or, in IT jargon, algorithms. Conventional wisdom says that to go up the technology ladder we need to hone our mathematical skills, analytical skills and the engineer’s practical problem-solving skills. So what is this newfangled psy-tech? Is it ESP, psycho-kinesis or a pearl of wisdom from Spock - the one with the serious face and pointed ears in Star Trek? Or is it something brewed and marketed by Deepak Chopra to the gullible in Mumbai and Malibu?
No, psy-tech is nothing as fashionable as that. It is a fact that the hard sciences and the liberal arts rule different worlds, of objectivity and subjectivity, and eye each other with great suspicion. However, many technologies have to marry the two to create successful products, and that is what gives rise to psy-tech.
In the world of technology there is nothing new in what I am saying. The Internet, the PC and Artificial Intelligence are all products of psy-tech. J C R Licklider left MIT in the early sixties to head the Information Processing Techniques Office of the Advanced Research Projects Agency (ARPA), attached to the US government’s Defence Department. He funded and brought together a computer science community in the US in the early 1960s. He also encouraged the development of computer science departments for the first time at Carnegie Mellon, MIT, Stanford and the University of California at Berkeley. This visionary was not a computer scientist but a psychologist. Over forty years ago he championed the need for interactive computing and the PC, and his ideas drove the creation of ARPANET, the first computer network, in the late 1960s. ARPANET eventually led to the Internet.
In a classic 1960 paper, “Man-Computer Symbiosis”, Licklider wrote, “Living together in intimate association, or even close union, of two dissimilar organisms is called symbiosis. Present day computers are designed primarily to solve pre-formulated problems, or to process data according to predetermined procedures. All alternatives must be foreseen in advance. If an unforeseen alternative arises, the whole procedure comes to a halt.
“If the user can think his problem through in advance, symbiotic association with a computing machine is not necessary. However, many problems that can be thought through in advance are very difficult to think through in advance. They would be easier to solve and they can be solved faster, through an intuitively guided trial and error procedure in which the computer cooperated, showing flaws in the solution.”
“When I read Lick’s paper ‘Man-Computer Symbiosis’ in 1960, it greatly influenced my own thinking. This was it,” says Bob Taylor, now retired to the woods of the San Francisco Bay Area. Taylor worked as Licklider’s assistant at ARPA and brought computer networks into being for the first time, through the ARPANET. After he left ARPA, Taylor was recruited by Xerox to set up the computing group at the Palo Alto Research Centre, the famous Xerox PARC, which became the cradle of the PC, Windows, the Mac, Ethernet and local area networks, the laser printer, the mouse and so on. No other group can claim to have contributed so much to the future of personal computing.
Another shining example of cross-pollination between the liberal arts and science is Herbert Simon, who was a political scientist and a psychologist. He created the first computer-based Artificial Intelligence programme at Carnegie Mellon University and is truly considered one of the founders of Artificial Intelligence. Simon received the Turing Award, considered the Nobel Prize of computer science, in 1975, and went on to win the Nobel Prize in Economics as well, in 1978, for his theory of ‘bounded rationality’.
These visionaries approached technology from a psychology background. What about engineers who approached psychology to come up with better products? I can think of at least three, all of Indian origin. The first and most well known globally is Amar Bose, chairman of Bose Corp. Bose finished his PhD with Norbert Wiener at MIT in 1957. He received a Fulbright Scholarship to spend a year in India. He used it to lecture at the Indian Statistical Institute, Calcutta, then headed by P C Mahalanobis, and at the National Physical Laboratory, Delhi, headed by K S Krishnan.
While waiting to sail off to India, Bose had bought a HiFi (high fidelity) audio system, the hottest thing then. Bose had repaired radios at his father’s workshop in Philadelphia since childhood and knew the system inside out. However, he found that the sound produced by the system’s speakers was far from HiFi. As a classical music lover and a violinist himself, Bose could not bear it. This led him to study acoustics by night during his sojourn in India. He was intrigued by the fact that the speakers, even when they actually adhered to the technical specifications printed in company catalogues, were not producing music as it is heard in a concert hall. At a very early stage, with a stroke of genius, Bose realised that improvements in circuitry were not the only key to better audio. He decided to venture into the budding field of psycho-acoustics, pioneered at Bell Labs in the 1930s. Psycho-acoustics deals with the perception of sound by human beings rather than the physical measurement of sound. MIT allowed Bose, then a very popular and very unconventional teacher of electrical engineering, to set up his company while continuing to teach at his alma mater. Years of painstaking experimentation resulted in the revolutionary Bose speakers. To the surprise of all audio experts, they did not have the familiar woofers and tweeters for the low and high frequency sounds and in fact directed almost 90 per cent of the sound away from the audience! A top honcho at a well-known Japanese consumer electronics company once told Bose that they had never taken him seriously, since they thought he was nuts! Of course, the tables turned later and today Bose is considered the most valuable audio brand globally.
The second example is that of N Jayant, executive director of the Georgia Centers for Advanced Telecommunications Technology (GCATT). A PhD student of B S Ramakrishna, a pioneering teacher in acoustics and signal processing at the Indian Institute of Science, Bangalore, Jayant joined Bell Labs in 1968. The focus in communication then was how to get good-quality voice signals at low bit rates. Those were the early years of digital signal processing. Normally one would require 64 kbit/s of bandwidth, but could it be done at the much lower bandwidths encountered in wireless and mobile situations? Among others, the US military was keen on developing low-bit-rate technology. The mathematicians and engineers came up with innovative coding and compression techniques to send as much data as possible in as thin a bandwidth as possible. However, if one wanted good-quality sound, one could not go lower than 32 kbit/s. Bishnu Atal, another alumnus of the Indian Institute of Science working at Bell Labs, came up with his Linear Predictive Coding techniques, which allowed telephonic conversations at 16 kbit/s using a very unconventional approach; in fact a version of his method is used in all cell phones the world over. But we can discuss Atal’s fascinating story another time.
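The idea behind linear prediction can be sketched in a few lines of Python (a toy illustration with assumed predictor coefficients, not Atal’s actual algorithm): predict each sample from a weighted sum of the previous samples and transmit only the prediction error, which needs far fewer bits when the prediction is good.

    import numpy as np

    def lp_encode(signal, coeffs):
        # residual[n] = signal[n] minus the value predicted from earlier samples
        order = len(coeffs)
        padded = np.concatenate([np.zeros(order), signal])
        residual = np.empty_like(signal)
        for n in range(len(signal)):
            prediction = np.dot(coeffs, padded[n:n + order][::-1])
            residual[n] = signal[n] - prediction
        return residual

    def lp_decode(residual, coeffs):
        # rebuild each sample by adding the residual back to the same prediction
        order = len(coeffs)
        out = np.zeros(len(residual) + order)
        for n in range(len(residual)):
            prediction = np.dot(coeffs, out[n:n + order][::-1])
            out[n + order] = residual[n] + prediction
        return out[order:]

    coeffs = np.array([1.5, -0.6])   # assumed fixed predictor; real codecs fit these per frame
    signal = np.sin(np.linspace(0, 4 * np.pi, 50))
    assert np.allclose(lp_decode(lp_encode(signal, coeffs), coeffs), signal)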
Going back to digital music, on which Jayant was primarily working: Jayant too discovered that a purely mathematical and algorithmic approach had limitations, and instead adopted a perceptual approach. This led to a major study of the frequency components actually heard by the human ear. They discovered that if a sound at any point in time had a thousand frequency components, the ear was sensitive to only about a hundred of them. That is, 900 (90 per cent) of the components of a sound could be thrown away without affecting the sound heard by the human ear. If the sound is sampled into 1,000 frequencies every hundredth of a second, then one can figure out which 900 of them can be thrown away. All that one needed was processing power that was fast enough, which became available in the late eighties and early nineties with developments in chip technology. It is this approach that led to MPEG-1, MPEG-2 and the now hugely popular MP3. We all know that MP3 technology has made the digital music industry possible. Once again perceptual studies provided the breakthrough.
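A toy version of this perceptual trick can be written in a few lines of Python (illustrative only; real codecs such as MP3 use psychoacoustic masking models rather than a blunt "keep the strongest 10 per cent" rule):

    import numpy as np

    def compress_block(block, keep_fraction=0.1):
        spectrum = np.fft.rfft(block)                  # frequency components of the block
        magnitudes = np.abs(spectrum)
        cutoff = np.sort(magnitudes)[int(len(magnitudes) * (1 - keep_fraction))]
        spectrum[magnitudes < cutoff] = 0              # throw away the weak components
        return np.fft.irfft(spectrum, n=len(block))    # what the listener would hear

    # a 10 ms block of an assumed test signal sampled at 44.1 kHz
    t = np.arange(441) / 44100.0
    block = np.sin(2 * np.pi * 440 * t) + 0.2 * np.sin(2 * np.pi * 1000 * t)
    reconstructed = compress_block(block)

Only the indices of the surviving components and their values need to be stored or transmitted, which is where the bit-rate saving comes from.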
While Bose and Jayant saw their studies lead to consumer products soon enough, in the case of Arun Netravali it has taken nearly three decades of waiting. Netravali joined Rice University for a PhD in the application of mathematics to communications soon after his graduation from IIT Bombay in 1967. After his PhD, however, he found the US enveloped in a recession following the oil shock of 1973. With no jobs available in industry, he found an offer from the venerable Bell Labs most welcome. He was asked to work in the video signal-processing group. In those days the hot thing being discussed was the “Picture Phone”, where the speakers can see each other. Obviously it was an idea whose time came three decades later, through video conferencing and 3G mobile phones. But in the seventies, soon after putting a man on the moon, everything seemed possible, at least to engineers in the US.
Once again the main obstacle to sending pictures and video through a wire was limited bandwidth. A TV signal requires about 70 megabits per second of bandwidth, whereas the good old copper wire networks of AT&T offered only a thousandth of that. Once again all sorts of ingenious techniques were thought up by engineers to compress the video signal. If the subject of the image (say the head and neck of the speaker on the other side) is not moving very fast, then one can assume that in the next frame being sent the image will have changed very little. So instead of sending the whole image again, one can send just the difference from the previous one. Going further, if the subject’s motion can be reasonably predicted (say the head moving from side to side in an arc of a circle), then one can calculate the probable position of the image in the next frame, send that information along with the difference between the calculated and the actual image, and so on. These are called adaptive differential coding techniques in the jargon of digital communication engineers. But all this ingenuity was of limited use, since the amount of compression needed was huge.
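The difference idea itself is simple enough to sketch in a few lines of Python (a minimal illustration with made-up frames; real codecs add motion estimation, quantisation and entropy coding on top):

    import numpy as np

    def encode_frames(frames):
        previous = np.zeros_like(frames[0])
        diffs = []
        for frame in frames:
            diffs.append(frame - previous)   # mostly zeros when little has moved
            previous = frame
        return diffs

    def decode_frames(diffs):
        previous = np.zeros_like(diffs[0])
        frames = []
        for diff in diffs:
            previous = previous + diff       # add the change back onto the last frame
            frames.append(previous)
        return frames

    # two nearly identical 4x4 "frames": the second difference has a single non-zero pixel
    frame1 = np.zeros((4, 4))
    frame2 = frame1.copy()
    frame2[1, 2] = 255.0
    assert np.array_equal(decode_frames(encode_frames([frame1, frame2]))[1], frame2)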
Then once again perceptual studies came to the rescue of Netravali. Which colours are the human eyes sensitive to? If a lady is sitting on a lawn and you are sending that picture across, which elements of the picture are more important than others? For example, the grass in the lawn may not be noticed in detail by the viewer, other than its green colour, whereas the face and the dress worn by the lady may be noticed immediately. Then why not send the relevant parts of the picture in greater detail and the others in broad strokes? Can patterns in an image be recognised by the sender and just a predetermined number be sent to denote a pattern, rather than the whole pattern, and so on and so forth? The result was the development of many video compression techniques at Bell Labs, in which Netravali played a major role.
This led to the concept of high-quality digital TV broadcast rather than flickering analogue images. But there is a long chasm between a consumer-friendly concept and a whole industry accepting it as a standard. To persuade the sceptics, Netravali and his team set up a demonstration of such a digital TV broadcast at the 1984 Olympics in Los Angeles. However, we remember the 1984 Games today for the success of Carl Lewis and the heroics of our own P T Usha, and not for digital TV. Soon enough Netravali got enormous peer recognition: IEEE medals, fellowship of the US National Academy of Engineering, the position of president of Bell Labs, the National Technology Medal from the US President and the Padma Bhushan from the Indian government. However, he could not get over the fact that the global politics of broadcast standards, and the cost to broadcasters, TV makers and viewers of abandoning the old analogue technology, would always brand his work as "one ahead of its time". But the 21st century has changed all that. Today the rage of the US TV industry is High Definition TV (HDTV), and Arun Netravali is a fulfilled man.
What is the moral of these stories?
Technology, unlike science, does not lead to a new theorem or another charmed quark or the secrets of a fold in a protein, all of which would be appreciated as breakthroughs in knowledge. It creates products, which are used primarily by other human beings. Thus the user - a human being - and his intelligence, stupidity, frailty, habits, curiosity, and variable sensory and cognitive capabilities have to be kept in mind while developing products. An engineer is normally not sensitive to these things. He looks at speed, robustness, reliability, scalability, power consumption, life-cycle cost, and so on. There are innumerable examples of products of pure engineering genius bombing in the marketplace. But we in the Indian tech companies have not learnt the lesson yet. A North American colleague recently remarked, "I have seen enough philosophy, psychology, history and English majors in US companies, but in India I see 99.99 per cent engineers. And that is their strength and weakness!"
If innovation is the bridge to survival and prosperity in the new economy, then a diversity of knowledge bases, soft sciences and hard technologies needs to be thrown into the cauldron, in the hope that the best comes out of the brew!
Linux and all that
The penguin has arrived
Tux the penguin, the symbol of Linux, is spreading out of the sweaty rooms of ponytails into the boardrooms of pinstripes, as it promises the nirvana of lower IT infrastructure costs while making that infrastructure more secure.
Shivanand Kanavi
If one were inclined towards numerology, one could try playing around with the number 1991 in many ways to read a turning point hidden somewhere. After all, it proved to be one. That year the cold war ended with the collapse of the Soviet Union, changing the bipolar world. It started a massive wave of economic restructuring the world over that still continues. That was also the year two seemingly innocuous initiatives were taken by technologists which are changing today's world.
One was an information management idea at the European Organization for Nuclear Research (CERN), Geneva, penned by Tim Berners-Lee in a proposal to his boss. His ideas led to the World Wide Web. Tim Berners-Lee refused to patent the idea and earn money from it. Today he is evangelising the development of the next generation of Web technologies, called the Semantic Web, as head of the W3C consortium at MIT.
The other was a piece of specialised software called an Operating System (OS), written by a computer science student, Linus Torvalds, at the University of Helsinki, Finland. He posted it on the Internet with a modest comment: “Hello everybody out there, using Minix. I'm doing a free operating system. Just a hobby, won't be big and professional. This has been brewing since April, and is starting to get ready. I'll get something practical within a few months, and I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them :-) Linus”.
Torvalds was using a PC with an Intel 386 chip and did not believe that his operating system would work on anything more complex than the hard disk of his PC. Today his OS, appropriately named by him—Linux, is powering a growing number of servers, thereby causing any number of managers in technology giants like Sun and Microsoft to reach for antacids and aspirins.
Unix, Windows, and Linux
Unix has its origins in the frustrations faced by researchers on a pioneering project. The project involved developing a computer utility - way back in the early sixties - which could be shared by many users, like power or water. It was called MULTICS (Multiplexed Information and Computing Service). If that sounds like the current buzzwords ‘grid computing’ and ‘computing on tap’, then you know exactly how ‘original’ those terms are!
The project involved MIT, Bell Labs and General Electric (GE). It led to many pioneering concepts in software and operating systems but took too long to fructify. As a result GE, which at one time had planned to enter computing in a big way, exited the field altogether. Bell Labs too dropped out of the project, while MIT chugged along for a long time.
A couple of engineers at Bell Labs, Ken Thompson and Dennis Ritchie (who also created the programming language C), who had worked on the MULTICS project, developed a single-user operating system and called it UNIX (Uniplexed Information and Computing Service). The name was also a pun on that of the original OS.
Thus UNIX was born in 1971. However, due to strict anti-trust laws which prohibited AT&T (which then owned Bell Labs) from entering fields other than telecommunications, the company was forced to give away the source code to various universities. Thus Unix became very popular at Stanford and Berkeley. In fact Sun Microsystems was inspired by the Stanford University Network and later developed its own version of Unix called Solaris. Berkeley developed its own free version of Unix called BSD.
Novell, another networking company, saw the possibility of client-server networks proliferating and developed its own network operating system called NetWare, which is still very popular. As Microsoft saw an opportunity to grow from desktop operating systems (DOS and Windows) to a network operating system, it developed Windows NT and later Windows 2000, which obviously had several features of Unix and Novell’s NetWare.
Meanwhile, Andrew Tanenbaum, a well-known computer science teacher and writer, wrote a small operating system in 1987 to teach his students, called Minix. It was a great teaching aid. But it also had deficiencies. Tanenbaum, however, refused to answer his critics and increase the complexity of Minix. Linus Torvalds went ahead in 1991, tried to improve on Minix, and called the result Linux. According to Ragib Hasan, a Linux enthusiast from Bangladesh, “The nerds of the world took up Torvalds’ challenge. Of Linux today, only about 2% was written by the ‘master’ himself, though he remains the ultimate authority on what new code and innovations are incorporated into it. Because the original code and instructions that make up Linux have been published, any programmer can see what it is doing, how it does it and, possibly, how it could do it better. Torvalds did not invent the concept of open programming but Linux is its first success story. Indeed, it probably could not have succeeded before the Internet had linked the disparate world of computing experts”.
According to independent technology market researchers like IDC, Linux is today the fastest growing OS on servers. True, it does not sound as revolutionary as the World Wide Web. But large numbers of people in India and other countries cannot afford to buy expensive software for home use. Governments starved of funds cannot use IT extensively for much needed citizen services and governance. Consequently, the ‘digital divide’, a term used to denote the masses’ lack of access to computing and the Internet, can turn into an unbridgeable chasm. That is why free Linux and a low cost IT infrastructure built on it seem to be the way out of the digital cul-de-sac.
Even if big businesses can afford IT costs, a rupee saved is a rupee gained and nobody can afford to be profligate. In these days of tech skepticism, when CFOs of even the largest corporations in the world are questioning IT spending and the return on investment, any development that reduces the cost of technology sounds very attractive. That is why big bets are being placed on Linux by giants like IBM, HP, Dell and Oracle. Despite its youth, several users are also ready to bet on Linux.
According to a cover story in Business Week (March 3, 2003), Wall Street’s investment bankers, one of the most tech-savvy crowds in the world, have already switched a majority of their in-house servers to Linux. Morgan Stanley, for example, hopes to save $100 million over the next five years by switching 4,000 high-end servers to much cheaper Linux-based servers.
Our own technologist-politician, President APJ Abdul Kalam, warmed the cockles of many a techie’s heart when he said recently (see Financial Express, May 29, 2003) that it is imperative for India to go for ‘open source’ software. Linux is the most well-known open source software.
Vinod Khosla, co-founder of Sun Microsystems and a leading venture capitalist in the Silicon Valley, agrees with Kalam. “I do believe India and China should coordinate their strategies in technology and software. There are many open source technologies, all the way from operating systems to applications that will work well if the two countries work together. It will help them train their people, keep costs much lower and improve their strategic importance to the world of technology”, he adds.
Competing OSs like Unix and Windows have earned their vendors tens of billions of dollars, but the guy who started the Linux movement in 1991, Linus Torvalds, did not earn a cent from it. He gave it away free. Anybody could download it from the web, improve it and put up the improvements on the web for others to scrutinise and use, but not sell.
Naturally, what Tim Berners-Lee and Torvalds miss in dollars is more than made up by the huge fan following they have.
When Michael Douglas, in his award-winning role as a takeover tycoon in the Hollywood movie Wall Street, said, “Capitalism is based on greed”, he was only stating the stark truth that is rarely mentioned in polite company. However, the quality of our life has been changed not only by the enlightened self-interest of individuals, but even more profoundly by the ideas and deeds of visionaries and savants who give it all away.
Is ‘open source code’ such a novel thing? Did Amir Khusro claim intellectual property rights over Hindustani ragas, or Purandara Dasa over Karnatak music? In our own times, Bohr did not patent quantum theory, nor Einstein relativity and E=mc², nor did John von Neumann his path-breaking architecture of the digital computer. All of science, mathematics, classical music and philosophy is ‘open source code’. They are all ‘peer reviewed’ and they inspire new developments.
If Linux is geeky flower power, a product of software ponytails, then how does it fit into the business plans of the pinstripes at big companies like IBM and Oracle, HP and Dell? Linux can be downloaded from the Internet for free or bought for a very small fee from various vendors who provide it on a set of CD-ROMs along with several applications. Once you buy a copy you can install it on any number of PCs without fear of being called a software pirate, because it is legal.
However, while Linux itself is free, applications built to be compatible with it need not be. Thus an Oracle 9i database built on Linux is not free, but OpenOffice, which does everything that Microsoft Office does, is. Moreover, one needs to spend money on Linux consultants for support, customisation, implementation and so on. But considering the total cost of ownership, according to an IDC white paper, Linux still scores over its competitors by a wide margin. In Internet-related services Linux wins on costs by leagues, and even in other services the gap is considerable.
The need for consultants and the growth of Linux technologies and applications in western markets have piqued the interest of Indian IT services companies as well. Infosys is acquiring Linux skills, though it is too early to talk about it, according to company sources. But TCS is already knee-deep in Linux. According to Gautam Shroff, who heads the architecture and technology consulting practice at TCS, “We have a mainframe Linux lab in Chennai with IBM mainframes, and an Intel Performance Lab in Mumbai for testing Linux under stressful conditions on Intel servers. In Delhi we have a dedicated lab to provide proof of concept for end-to-end Linux solutions for enterprises. Some of our packaged banking solutions are already available on Linux. As consultants we are also helping our customers chalk out their Linux strategy, based on our experience with the Linux platform.”
If the market expands due to lower cost to the end user, then clearly the application software companies will be more than happy to eliminate a layer of proprietary OS vendors like Microsoft and Sun. In the case of Microsoft they might even do so with a glint in their eyes. The move is already paying dividends. For example, Intel was a late entrant into the server market, but inexpensive Intel servers, called blade servers, are selling like hot cakes. PC assemblers like Dell who diversified into cheap servers are also showing a high rate of growth. Loading Linux on them has definitely helped. The Intel-Microsoft alliance called ‘Wintel’ dominated the PC market, but today there is much talk of ‘Lintel’, as Linux servers sold by Dell and IBM show huge growth rates. No doubt the absolute numbers are still small. For example, according to a Merrill Lynch report dated March 5, 2003, sales of Unix servers in Sep-Dec 2002 amounted to $5.6 billion and those of Windows-based servers to $3.8 billion, while those based on Linux amounted to a ‘paltry’ $681 million.
Then why antacids at Sun and Microsoft? Well, it is the growth rate, silly. In the last quarter of 2002, Unix server sales fell by 10% (year over year) and sales of Windows servers rose by 6%. But Linux servers clocked a scorching 38% rise!
IBM’s CEO Sam Palmisano is reputed to have asked his colleagues in December 1999, as he took over the leadership of Big Blue, which new technologies to bet on. One clear answer from his team was Linux. As a result IBM has already spent over a billion dollars developing hardware, software and services for the Linux platform. IBM has also built alliances with five global Linux distributors: Red Hat, Caldera Systems, SuSE, Turbolinux and Conectiva. Today, it has deployed over 1,500 engineers on Linux development. No wonder that when Palmisano made a very low-profile visit to Bangalore last year, Chief Minister S M Krishna invited him to set up a Linux development centre at Hubli, in northern Karnataka.
IBM’s conversion to Linux is all the more remarkable because, till recently, IBM pushed its own proprietary operating system, AIX, on its own RISC chips. Today it is not shy of evangelising the open source Linux on servers with Intel chips.
Oracle’s Larry Ellison too has been very bullish on Linux. “We are already practicing what we are preaching. Oracle Corp is converting all its IT infrastructure to the Linux platform. In fact we expect the next quantum of cost savings leading to a higher profit margin of close to 40%, to come from this migration among other things” says Shekhar Dasgupta, MD, Oracle India.
“Oracle is fully committed to supporting the Linux operating system. Ours was the first commercial database available on Linux. We believe that Linux is more attractive today than it ever was, as customers are looking for cost-effective solutions. Over the past few years Oracle and its customers have learned a tremendous amount about running Oracle on Linux for enterprise class deployments. Combining this knowledge with the opportunity to drastically reduce IT infrastructure costs has provided the catalyst for Oracle to move to the next step, which is to even provide front-line technical support for Linux itself in addition to supporting the Oracle stack”, adds Dasgupta.
It is not just a matter of price; there is also increasing concern, bordering on paranoia, about security and reliability. Corporations and governments cannot afford downtime in their servers because the OS crashed, nor can they tolerate a virus or a hacker attack. On that score Unix has had very high standards and has consequently dominated high-end mission-critical servers. Windows, however, has had a checkered history in this regard, and hence few want to risk basing critical applications on it.
But Linux, which has Unix-like features, has proved to be very robust. Says D Seetharam, country manager, government relations, IBM India, “Among other technical things, the very design of Linux makes it more difficult for viruses to spread. Moreover, since the source code is open to inspection and public comment by the entire developer community, the glitches get ironed out before official release.”
Naturally, some of the biggest deployments of Linux in India are on defence-related servers. According to Javed Tapia, director of Red Hat India, Linux deployment in Pakistan is ahead of India’s, and it is growing in Sri Lanka as well. Governments elsewhere too are recommending Linux. The governments of Germany, China and Taiwan are already big users, and the European Commission too has issued a circular to the same effect.
Many users in India seem to be waiting for the lead to be taken in western markets. However, Tapia waxes eloquent on the Linux deployments at Central Bank and IRCTC. Central Bank has used Linux in all its 619 branches in its total banking automation solution, while IRCTC has deployed Oracle’s e-business suite to automate and streamline processes in over 30 locations across India.
“We are implementing an ERP solution on Red Hat Linux Advanced Server. Our initial reaction: Linux seems to be the answer for enterprise-wide low cost computing. The final word will of course have to wait for the full roll out”, says Amitabha Pandey, group general manager, IT services, at IRCTC. “We are probably the first full-scale ERP implementation on Linux in India”, he adds.
Seeing the direction of the wind, Sun Microsystems too is running behind the bandwagon to get a look-see. It recently backed Linux in a limited way for desktop computers, a segment which is not its forte. It has released a Linux-based version of an office productivity tool called StarOffice, which is much like Microsoft’s money-spinning MS Office.
Red Hat’s Linux 9.0 comes prepackaged with OpenOffice and other tools, games and even a programming environment. “If you look at a comparable package from Microsoft then you will probably spend at least as much on the software as on the hardware, thereby doubling the entry barrier to home personal computing”, says Shashi Unni, a Red Hat training expert.
Under these circumstances, Linux should be spreading like wild fire on desktop PCs. But it is not so. The reason is twofold: one relates to the usage environment and the other to the youth of the technology. Small enterprises and home users in India use illegal copies of both the OS and applications without any compunction. Thus telling them that Linux comes almost free makes no difference to them. It is only when they suffer virus attacks, frequent crashes and the like that they start seeing the advantage of using Linux. Secondly, Windows has been the most successful OS on the desktop. Hence manufacturers of hardware peripherals like modem cards, web cameras, scanners and printers have invested in writing software called ‘drivers’ based on Windows, so that the machine automatically recognises the new peripheral. Most of these hardware manufacturers are yet to provide Linux-compatible drivers to users. So one can find, after loading the latest version of Linux, that the internal modem card is not recognised by the OS, a major irritant as Internet access is one of the main functions of a PC.
External modems, however, have no problem with Linux. “But this problem cannot be wished away”, admits Tapia of Red Hat. “Till hardware vendors start providing Linux-compatible drivers, which is not too far away, we have an alternative strategy. We are working with PC vendors and providing a Linux-certified hardware list to them, so that one can just load Linux and plug and play”, he adds.
Already major PC vendors in India are offering Linux-loaded PCs at a price almost 30% lower than Windows-loaded ones. After all, who would not like an IBM PC or an Acer laptop that comes with all warranties and legal software but competes with the neighbourhood assembler’s price?
The low cost and technical robustness, along with the opportunity to modify and develop it further, have made Linux highly popular among India’s tech powerhouses like the IITs, BARC and TIFR. In the US too, four leading scientific laboratories, the National Center for Supercomputing Applications (NCSA) at the University of Illinois, the San Diego Supercomputer Center (SDSC) at the University of California, the Argonne National Laboratory near Chicago and the California Institute of Technology in Pasadena, are building a very high-powered grid of supercomputers powered by Linux.
If we want an IT-enabled nation then clearly Linux offers the best bet at the moment. Already Linux distributors and consultants like Red Hat are working on Indian language support in Linux, making it even more attractive.
We say Amen to that.
A Hollywood Beauty and CDMA
Reaching out with spread spectrum
A 60-year-old idea patented by a Hollywood actress is revolutionising wireless technology.
Shivanand Kanavi
Tom Cruise and Nicole Kidman, the famous Hollywood star couple, are deeply upset. Recently, a man used a commonly available frequency scanner to find out what frequency their cellular phones were using. He then proceeded not only to snoop on their private conversation but to tape it and sell it to a tabloid in the US. The episode brought into the limelight the lack of privacy in a cellular phone call. However, had they been using cell phones based on a spread spectrum technology like CDMA, such snooping would not have been possible.
Spread spectrum technology assures a high level of security and privacy in any wireless communication. It has come into the limelight in the past decade, and especially in the past five years, after a US company, Qualcomm, demonstrated its successful application in cellular phones. Since SS technology can be used for secure communications that cannot be jammed or snooped on, the US military has done extensive research and development on it since the 1960s. Ironically, this hi-tech, revolutionary concept in radio communication was patented by a Hollywood diva, Hedy Lamarr, nearly 60 years ago.
Hedy Lamarr--Unlikely inventor
Though gorgeous and glamorous, she was not a bimbo. Hedy Lamarr hit the headlines as an actress with a nude swimming scene in her Czech film Ecstasy (1933). Later she was married to a rich pro-Nazi arms merchant, Fritz Mandl. To Mandl she was a trophy wife, whom he took along to many parties and dinners to mingle with the high and mighty of European politics, the military and the arms trade. Little did he suspect that beneath the beautiful exterior lay a sharp brain with an aptitude for technology. Hedy was able to pick up quite a bit of the technical shop-talk of the men around the table.
When the war began, Hedy, a staunch anti-Nazi, escaped to London. There she convinced Louis Mayer of MGM studios to sign her up. Mayer, having heard of her reputation after Ecstasy, advised her to change her name from Hedwig Eva Maria Kiesler to 'Hedy Lamarr' and to act in "wholesome family movies", which she promptly agreed to.
As the war progressed and the US entered it after Pearl Harbor, Hedy informed the US government that she was privy to a considerable amount of Axis war technology and wanted to help. The Defence Department had little faith in her claims and advised her to sell war bonds. Hedy, however, was unrelenting. She, along with her friend George Antheil, an avant-garde composer and musician, patented their 'secret communication system' (1941) and gave the patent rights free to the US military. The patent described a design for jamming-free radio guidance of submarine-launched torpedoes, based on the frequency-hopping spread spectrum technique. It consisted of two identical punched paper rolls. One roll, located in the submarine, changed the transmission frequency as it rotated, and the other, embedded in the torpedo, helped the receiver hop to the appropriate frequency. The enemy jammer would thus be left perennially guessing the guiding frequency.
The idea, though ingenious, was too cumbersome as it involved mechanical systems, and was hence not applied by the US Navy. However, in the late 1950s, as electronic computers appeared on the scene, the US Navy revived its interest in Hedy's ideas. Subsequently, with the development of microchips and digital communication, very advanced secure communication systems were developed for military purposes using spread spectrum techniques. In the telecom revolution of the 1990s, these techniques have been used to develop civilian applications in cellular phones, wireless in local loop, personal communication systems and so on. The unlikely inventor showed that if you have a sharp brain, even party hopping can lead to frequency hopping!
Spread Spectrum
Instead of using one fixed frequency, what if the transmitter keeps jumping from one frequency to another in a random fashion? Then, by the time the "enemy", who wants to snoop in or jam the transmission, finds the frequency with a high-speed scanner, the frequency would have changed. As long as the hopping does not have a pattern that can be detected, and the receiver knows the exact sequence of hopping, both snooping and jamming become impossible. Thus a user does not occupy a channel; many users can share a band as long as their hopping sequences do not clash. This technique is called frequency hopping spread spectrum. Hedy's idea belonged to this set.
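For the technically curious, here is a minimal Python sketch of the frequency hopping idea described above. It is purely illustrative, not taken from any real CDMA or military system, and all names and numbers in it (the seed, the 20 channels, the eavesdropper's channel) are invented for the example: transmitter and receiver share a secret seed, derive the same pseudo-random hop schedule from it, and so always meet on the right frequency, while an eavesdropper parked on one channel catches almost nothing.

import random

CHANNELS = list(range(20))  # 20 notional frequency slots in a shared band

def hop_sequence(seed, hops):
    # Both ends derive the same pseudo-random hop schedule from a shared secret seed.
    rng = random.Random(seed)
    return [rng.choice(CHANNELS) for _ in range(hops)]

message = "HELLO"
secret_seed = 42  # known only to transmitter and receiver (hypothetical value)

# Transmitter: one symbol per time slot, each sent on the scheduled frequency.
tx_schedule = hop_sequence(secret_seed, len(message))
air = {(slot, freq): symbol
       for slot, (freq, symbol) in enumerate(zip(tx_schedule, message))}

# Receiver: regenerates the identical schedule and listens on the right slot each time.
rx_schedule = hop_sequence(secret_seed, len(message))
received = "".join(air[(slot, freq)] for slot, freq in enumerate(rx_schedule))

# Eavesdropper: camps on one fixed channel and catches next to nothing.
snooped = "".join(air.get((slot, 7), ".") for slot in range(len(message)))

print(received)   # HELLO
print(snooped)    # mostly dots

A real frequency hopper changes frequency many times a second and derives its schedule from far stronger pseudo-random generators; the toy random module here merely illustrates the principle that the security lies in the unpredictability of the hop sequence.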
Another set of techniques is direct sequence SS, where the signal is mixed with a strong dose of noise and transmitted. Only the receiver knows the noise that has been added, so it subtracts the same from the received signal, thereby recovering the transmitted signal. This technique works best when the added noise is very powerful. (In reality, noise means a completely random jumble of power output across all frequencies. In this case the jumble is not totally random, but only the receiver is privy to it; hence it is called pseudo-noise.) Normally noise is acquired during transmission, like the static hiss and other crackling sounds in a radio during a storm. Hence every radio engineer tries to broadcast the signal at high power, so that at the receiver's end the signal-to-noise ratio is high enough for the message to be intelligible. After all, the received power falls inversely as the square of the distance from the transmitter while the noise level does not. Direct sequence SS turns this situation upside down: it needs a weak signal and a strong noise to be effective. An everyday example can be seen at a cocktail party. When the noise level is very high, there is maximum privacy for conversation between neighbours, because both ears discount the same background noise while a third person hears only the noise! Qualcomm's CDMA belongs to this set of direct sequence techniques.
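Direct sequence spreading can be sketched just as simply. The Python fragment below is again an illustrative toy under assumed parameters (a spreading factor of 16 chips per bit, a made-up seed), not Qualcomm's actual algorithm: each data bit is multiplied by a run of pseudo-noise 'chips' known only to the two ends, and the receiver correlates the incoming chips against the same sequence and reads off the sign of the sum, while anyone without the seed sees what looks like noise.

import random

CHIPS_PER_BIT = 16   # spreading factor (illustrative)
SEED = 7             # the shared secret; without it the chips look like noise

def pn_chips(seed, length):
    # Reproducible pseudo-noise sequence of +1/-1 chips.
    rng = random.Random(seed)
    return [rng.choice((+1, -1)) for _ in range(length)]

def spread(bits, seed=SEED):
    # Each data bit (+1 or -1) is multiplied by its own run of PN chips.
    pn = pn_chips(seed, CHIPS_PER_BIT * len(bits))
    return [bit * chip
            for i, bit in enumerate(bits)
            for chip in pn[i * CHIPS_PER_BIT:(i + 1) * CHIPS_PER_BIT]]

def despread(chips, nbits, seed=SEED):
    # Correlate against the same PN sequence; the sign of each sum recovers the bit.
    pn = pn_chips(seed, CHIPS_PER_BIT * nbits)
    bits = []
    for i in range(nbits):
        start, stop = i * CHIPS_PER_BIT, (i + 1) * CHIPS_PER_BIT
        correlation = sum(c * p for c, p in zip(chips[start:stop], pn[start:stop]))
        bits.append(+1 if correlation > 0 else -1)
    return bits

data = [+1, -1, -1, +1]                     # the message bits
on_air = spread(data)                       # looks like a random jumble of +1/-1 chips
print(despread(on_air, len(data)))          # [1, -1, -1, 1] with the right seed
print(despread(on_air, len(data), seed=9))  # wrong seed: essentially random guesses

The same correlation trick is what lets many users share one band in CDMA: each pair uses a different pseudo-noise sequence, and everybody else's transmissions average out, much like the background chatter at the cocktail party.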
Breaking out of channels
SS techniques violate all conventional wireless wisdom. Traditional wireless systems, however sophisticated, are based on the channel separation principle: each user has to use a fixed radio frequency, or a small part of the spectrum, exclusively. In the very early days of wireless, when two pioneers, Marconi and De Forest, were vying to show off the superiority of their respective radios, the problem of interference that has dogged wireless to this day first appeared. Both were then trying to sell their ideas to the public, and went on to broadcast the first-ever live commentary of a yachting event. However, they found to their dismay that their broadcast frequencies were too close to each other. This led to interference, or unintentional jamming, and all that the listeners could receive was garbled sound. Hence the principle: "only one transmitter can use a frequency at any time". Since the radio frequency spectrum is limited, the allocation of frequencies has become a major regulatory job. It is done by the International Telecom Union at the international level, and by bodies like the Wireless Planning Committee in India and the Federal Communications Commission in the US at the national level.
Channelisation is ingrained in the thinking of radio engineers. They strive for better transmit filters to contain the transmission in a narrow channel. They strive for better receive filters to reject any interference that may assault their receiver. They strive for hyperstable frequency synthesisers to keep the carrier tuned as sharply as possible. Because of the scarcity of spectrum, radio engineers continuously look for ways to narrow bandwidth through channel splitting, various multiplexing techniques, better coders and modulators-demodulators (modems) and so on. Thus each cellular operator in India, who has been given about 12.4 MHz of spectrum, can accommodate about 200 simultaneous users in each cell. However, SS techniques developed in the past decade have demonstrated that even in a mobile environment they can accommodate 10-20 times more users than analog cellular systems and four to seven times more users than traditional digital systems, though they violate the basic concept of channelisation. In fact, SS techniques work better, and hence more efficiently, in wider bandwidths. To a traditional radio engineer, SS enthusiasts appear to be wild-eyed hippies.
Similarly, a major problem in mobile phones is what is called "multipath": the same signal gets reflected by various geographical features and reaches the receiver at different times, leading to the voice fading in and out. Multipath is a frequency-dependent effect, and hence it does not affect SS-based systems, since the broadcast is not at one frequency but over a whole bunch of them in a wide band.
Having proven its superiority over traditional wireless technology, SS is becoming more and more popular, especially in fixed wireless applications. Most of the new basic telephone operators who are using wireless in local loop to connect their exchanges with customers will be using CDMA. Even the next generation of traditional cellular phones, 3G, will incorporate this technology.
Friday, July 27, 2007
Nuclear, anti-nuclear
Our lonely nuclear high priests
Shivanand Kanavi
In a ceremony at Trombay on January 20, 1957, to name the first swimming pool type reactor, APSARA, Jawaharlal Nehru made some perceptive remarks. “I am happy to be here,” he said, “not because I know very much about atomic energy or reactors, in spite of the numerous attempts Dr Bhabha and Dr Krishnan have made to educate me, but without understanding the intricacies of these mysteries, I hope I have some conception of the importance in this world of ours, of the release of this great power.”
“In the old days, the men of religion talked about mysteries. In ancient Greece, there were the mysteries. High priests who apparently knew about these mysteries exercised a great amount of influence on the common people who did not understand them. In every country that was so. The high priests in those days possibly dominated the thinking in many countries with their mysterious functions, ceremonies and rituals.”
“Now we have these mysteries, which these high priests of science flourish before us, not only flourish but threaten us with; and at any rate make us full of wonder or full of fear. Whatever it is, we have got these new mysteries of science, and of higher mathematics, which are unveiling various aspects of the physical world to us. No one knows where this will go.”
In a flash of brilliance, Nehru had captured the predicament of the common man when faced with the mysteries of nature and the high priests of science: feeling both wonder and fear.
Our atomic scientists, who best represented the growth of science and technology in modern India, are no longer the unalloyed heroes they were in the fifties and sixties in the public perception. Why is this so? Have we come half circle from wonder to the fear that Nehru perceived?
From the euphoria of the fifties to the anti-nuclear agitations and litigations of the eighties and nineties, a section of our intelligentsia seems to have made an about-turn. Some even sound like anti-science mystics. This phenomenon is worth investigating seriously. Otherwise, we will be lost in pro-nuclear and anti-nuclear dogma and rhetoric.
Pro-nukes call the anti-nukes irrational, stubborn, callous towards mass poverty alleviation, radical chic and even agents of western imperialism who are trying to force India to go slow on its nuclear programme and eventually sign the hegemonistic nuclear Non-Proliferation Treaty.
The anti-nukes call the other side ivory-tower technocrats, science fundamentalists, reductionists (a new abuse), brainwashed by the western concept of material progress, and so on. It is clear that the dispute has crossed the limits of decency and become a them-and-us confrontation.
A good example is that of Dr Shivaram Karanth. More than four decades ago he compiled and published, at his own cost, a beautifully illustrated and lucidly written three-volume children’s encyclopedia called Bala Jagattu in Kannada and a two-volume science encyclopedia called Vijnana Prapancha .
It was a pioneering effort in popular science writing in Kannada. Today, Karanth, a Jnanpith laureate and nonagenarian intellectual, is fighting a prolonged battle in the Supreme Court against the nuclear power plant at Kaiga, 60 kilometres from Karwar, in Karnataka.
These developments appear to have demoralized nuclear scientists and engineers. They have not come to terms with their transformation from heroes to villains, from nation builders to potential destroyers. I have seen this bewilderment expressed in numerous conversations. More than the resource crunch, what seems to have hit them is this fall from grace.
The Pressurized Heavy Water Reactor being installed at Kaiga has been developed by Indians and is a credit to their skills; it is much safer than the type that was used in Chernobyl. It uses natural uranium, as against the enriched uranium that Western nations have refused to sell to India unless it signs the NPT.
The Kaiga plant will produce 440 MW of electricity to begin with and ultimately 1,400 MW, shoring up the infrastructure in power-hungry Karnataka. The plant has enough in-built safety devices. Of course, accidents can happen anywhere. The rain forest cut to clear the land for the project has been adequately compensated for by reforestation, both in Kaiga and in distant Chamarajnagar and Mandya. It is to be noted that the forest cleared is about five percent of what was destroyed for the 1,200 MW Kalinadi hydroelectric project.
Taking note of all these factors, the Supreme Court in a recent judgment clubbed together all the petitions filed against the Kaiga project and dismissed them. The court advised the Department of Atomic Energy that, if it so desires, it can give a hearing to the petitioners’ grievances. The department has given the petitioners an opportunity to prepare a brief, using the National Environmental Engineering Research Institute (NEERI) report on the environmental impact of the project if necessary, and submit it for discussion.
Clearly, the Supreme Court verdict is a vindication of the stand of the nuclear scientists and not that of the anti-nuclear movement, which had concentrated on the safety aspect and the destruction of the rain forest at Kaiga. Has the issue been settled? It is doubtful. There may be agitation again. “Why is it that issues which can be explained with facts, figures and reasoning are not understood by some of our intellectuals?” asked an exasperated nuclear scientist.
It could be pointed out to him that issues regarding waste disposal, and even closing down a reactor after its useful life of normally 25 years, are still to be solved satisfactorily. Moreover, why should people blindly believe the high priests of science? For example, did our farmers know about the use of chemical pesticides for thousands of years? Then came the Green Revolution in the sixties, and our agricultural experts from universities and agro-corporations taught our farmers the new technology.
About two decades later, a factory manufacturing pesticides in Bhopal leaked deadly methyl isocyanate and killed thousands in their sleep. Now how can you expect the common man to take your word for it? There is bound to be an adverse reaction against science and technology. Some of it may be justified and a lot of it may be irrational fear. But the science establishment has not yet learnt to deal with it.
There are basically two reasons for this development, and they have to be dealt with separately. One is the lack of information about science and technology among laymen, which naturally calls for an increased effort in popularizing science. A number of organisations are finally realizing this.
The establishment of the Directorate of Environment and Public Awareness within the Nuclear Power Corporation is an example of this. But the second reason is more complex and needs to be dealt with at different levels. It has to do with the alienation of the state and governance from the people. The marginalization felt today is so acute that anything that has to do with the government and comes from some office in New Delhi or a state capital immediately raises the hackles of many people.
The corruption, arrogance and, of late, brute violence that are increasingly associated with the state repel many Indians. Scientists working with the government are tainted by association with such an apparatus.
The solution is not within the ambit of scientists alone. It is an urgent national problem that requires effort from all of us. But scientists working in the government have to be more responsive to public opinion and do their best to win over their opponents. The haughty style of innuendo and ridicule and answering questions only when they are asked in the Lok Sabha has to change.
For example, when I asked our nuclear scientists why they do not meet Dr Karanth and others and win them over, they had no answer. Their approach has been limited to a few pamphlets, press statements and a debate in Bangalore five years ago.
I think they should read Nehru’s speech carefully and emerge from their ivory towers.
Arthashastra
The Real Chanakya is lost in CHANAKYA, the TV Serial
Shivanand Kanavi
Many viewers are beginning to see red on seeing saffron in the popular TV serial Chanakya. This, in turn, has sparked off a debate in the media about the serial’s historical authenticity. Some have voiced skepticism over whether a saffron flag existed in the fourth century BC. Others have implied that Dr Chandraprakash Dwivedi, the writer-director-actor of the serial, is using it to propagate Hindutva.
The controversy is unnecessary. The serial, set against a historical backdrop, is fictional. In fact, the controversy would not have arisen if Doordarshan had learnt a lesson from the dispute over the historical authenticity of The Sword of Tipu Sultan, and made an announcement before every episode that the serial is fictional. But who was Chanakya, or Kautilya? We know nothing about his personal life. We have some details about Chandragupta Maurya from Greek sources, which refer to him as Sandrokottos. But even these reports survive only as fragmented quotations in other works; the originals are untraceable. As D D Kosambi points out, as far as Chanakya is concerned, we have only legends, fictionalized through the famous Sanskrit play Mudra Rakshasa, written by Visakhadatta in the fourth century AD.
Today, all we know about Chanakya is through his work, the Arthashastra. This work was studied until the twelfth century and then was lost for many centuries. It was rediscovered in 1905, but only in parts. As R P Kangle points out in his preface to the Arthashastra, Part II (Bombay University Publication, second edition, 1972), no complete copy of the Arthashastra has been recovered so far. The ancient Sanskrit used in the text is also open to many interpretations, besides copyists’ errors and interpolations. At the end of it all, the Arthashastra is a treatise on political economy and says nothing about its author directly. Thus anybody claiming to know about Chanakya the person should not be taken too seriously.
But having said all this, one cannot help but be impressed by the author of the Arthashastra. In fact, we have to be extremely thankful to him for providing invaluable information about the economy and politics of fourth century Magadh. He writes in a clear-cut, terse style with no scope for cant.
A quick glance at the Arthashastra yields rich insights into the state of Magadh. The title translates as “the science of material gain”. In the very first verse, the author acknowledges his debt to other ancient theoreticians and modestly says that he has done nothing more than compile and survey other masters’ views. He puts forth the aim of the book as teaching the ruler “how to acquire and protect his kingdom”.
The state in Magadh appears very privileged indeed. It was the main land-clearing agency in the primeval forests surrounding pockets of population in the Indo-Gangetic plain, and later also in the Deccan. It was the largest landowner and the principal owner of mines. After detailing all the precautions the ruler should take against corrupt state servants, Chanakya admits that it is as difficult to detect an official dipping into the state’s revenues as it is to discover how much water has been drunk by swimming fish. The monarch, as he emerges in the Arthashastra, far from wallowing in luxury, was the most hard-working person in the kingdom, his entire day strictly charted out, with time set aside for sports, consultations with ministers and the heads of the treasury and the army, and receiving secret reports from spies, interspersed with short spells of leisure.
Strife for the throne is treated as a minor occupational hazard of kingship. In fact, Chanakya quotes a predecessor’s axiom: “Princes, like crabs, are father eaters!” The Arthashastra never contemplates any interruption in the policy of the state, no matter what happens in the palace. Externally, the armed tribal oligarchies, which maintained tribal exclusivity and some democratic traditions, are considered serious obstacles to the absolutist state, both politically and ideologically. Ways to break up and subdue these tribal oligarchies are detailed in Chanakyaneeti. Secret agents are used not only to spy on officials but also to monitor public opinion and even to mould it through disinformation campaigns.
As far as the economy was concerned, Chanakya vigorously promoted direct settlements on waste lands and the clearing of forests for cultivation. Land was divided into leased land, on which taxes were collected, and vast crown lands, which the state cultivated. Productivity was a paramount consideration, and if a lessee’s heirs did not cultivate the land properly, the lease was cancelled.
There was a form of social security for the aged, infirm, widows and pregnant women. The state maintained buffer stocks, not only of grain but also of essentials like timber, rope, tools, etc. to be distributed to the public during times of crises like famines or epidemics. Prostitution and wine production were legalised and taxed; in fact, there were separate ministries for them.
A more developed cash economy cannot be imagined. But all this was confined to the towns. The villages belonging to the vast crown lands were like camps of forced labour; increasing productivity was the only thing that mattered. The villagers were allowed no diversions of any kind.
The state also maintained a huge standing army, estimated at half a million, with handsome salaries for the soldiers. But the cash was mopped up by the state by retailing essentials to the soldiers at inflated prices. The state monopoly in mining was crucial, and Chanakya says, “the treasury is based on mining, army upon the treasury. He who has both can conquer the whole wide earth!”
In the event of a financial emergency, methods similar to modern deficit financing were practised. Chanakya also suggests alternative methods like state loans and national debt, and even framing charges against rich merchants and making them pay in times of emergency!
Chanakya certainly evokes interest among historians and students of political economy, but the Arthashastra is of interest to students of Indian philosophy as well. At the very beginning of the Arthashastra, Chanakya enumerates what he considers the sciences worthy of being studied by the prince who is training to be the future king. There he mentions philosophy, the three (not four) Vedas, economics and the science of politics as the four sciences to be studied diligently. In fact, he disagreed with the followers of Manu, who regarded only the Vedas, economics and politics as the sciences. Chanakya extolled philosophy as the “lamp of all actions, support of all laws”.
But what is startling is what Chanakya considers philosophy. He mentions only Samkhya, Yoga and Lokayata. Samkhya, it may be noted, is an intensely atheistic Indian trend which was naturalistic in outlook. Yoga here is not the Yoga of asanas but another name for Nyaya-Vaiseshika. This school extols doubt, debate, inference, syllogism and, moreover, an atomic and molecular theory of matter. The third leg of his triad, Lokayata, is another name for Charvaka, a primitive materialistic trend. This trend was apparently highly respected by Chanakya, whereas later it was suppressed by the followers of Manu. Lokayata now survives only in the polemics against it by its opponents; none of its own works have been discovered. As Debiprasad Chattopadhyaya noted, Chanakya, while extolling philosophy and setting himself apart from the followers of Manu on this question, was actually extolling philosophy in the broad sense of the term, and particularly those trends in Indian philosophy that had rational elements in them. Chanakya thus emerges from the Arthashastra as a clear-thinking, bold and rational theoretician.
Having said this, I would urge you all to watch the fictionalised Chanakya every Sunday morning and enjoy the histrionics of Chandraprakash Dwivedi and Co.