Monday, September 22, 2008

Moving to WordPress

I am finding it extremely difficult to manage two blogs, and hence I am moving all the content over to my WordPress blog. You can access the older content as well as the new at

Monday, September 15, 2008

Lehman: Hubris, followed by Nemesis?

The Financial Times has a very good read on the state of Lehman Brothers, which grew to be among the top 4 investment banks in the world. It describes how Richard Fuld's never-say-die attitude has saved the bank in the past and put it on a growth trajectory, but this time, perhaps, he held back far too long [link shared by Rave]:

Lehman’s collapse is worrying for financial markets and for Wall Street as a whole. It is also a tragedy for its 24,000 employees, who were drilled into unwavering loyalty and cohesion by Mr Fuld. Many held a lot of their wealth in Lehman shares, which have lost most of their value.

It is also a tragedy for Mr Fuld, in the classical Greek sense. He had devoted so much of his life and his personality into moulding the bank he could not accept its decline. If he had sold out earlier, Lehman might have survived but he was too proud. It was hubris, followed by nemesis.

I hope there is still a white knight somewhere who can save Lehman, because I wonder if its collapse will bring down the house of cards.

Wednesday, September 10, 2008

Google Crawler hitch brings down UAL

This would rate pretty high in Ripley's Believe It or Not! Google posted a 2002 news story about UAL going bankrupt on its front page, which sent UAL's share price to rock bottom. [link]

Shares of UAL lost 75% of their value in seconds, plummeting as low as $3 from $12.30 before the story appeared on Google. Some investors in UAL stock lost a ton of money. The stock hit an all-time low on heavy volume.

The shares bounced back after the market realized it was a 6-year old story on the company’s 2002 bankruptcy filing that appeared on Google. Investors who sold on the news were stuck.

Google declined comment on the incident. Later, it blamed the Sentinel for posting the 2002 Chicago Tribune article on their website. The Nasdaq Stock Market, where UAL shares are listed, said trades triggered by the erroneous report wouldn't be rescinded. The Google story then was picked up by Income Securities Advisors, a Florida investment newsletter, and disseminated over Bloomberg News, triggering a wave of panic selling. It appeared as "United Airlines files for Ch. 11 to cut costs."

Amazin', ain't it? No wonder Google is facing antitrust investigations due to the immense power it wields.

Wednesday, September 3, 2008

Chrome and Email check!

Google just released Chrome, its open-source browser, which has a bundle of new features like running each tab as a separate process. In some ways, it seems to be a throwback to the days of IE6, when each browser window was its own process! However, the new paradigm gives a tabbed UI with a process-based backend, which should be interesting to try.

One big advantage with Chrome is going to be finding out which web applications are badly designed and hog memory -- something that is pointed out in this blog post.

I decided to try opening all the new email services (Gmail, the new Yahoo! Mail, and Live Mail) in separate tabs and testing their memory usage:

It seems to me that all the mail programs actually use up a lot of space (Gmail and Y! Mail top it at around 20 MB each -- looks like there was some GC happening and the Gmail process reclaimed some memory). Live Mail seems to be the most lightweight!

However, here is what worries me: if my mail tabs are using up that much space, and it is only going to go up as these applications add more complexity, why would I not use a desktop-based mail client, switching to a web-based client only while on the go? A lot of times I have to deal with flaky connections, and it seems obvious that web-based mail clients are downloading a ton of stuff every time!

The pain points of offline mail clients are keeping perfectly in sync with the server (IMAP notwithstanding, and perhaps that should be worked upon) and installing updates, because of which their UIs now lag behind their web-based counterparts. If desktop applications can figure out some way of cleanly installing new features and keeping things completely up to date, would they get back in vogue?

Update: And this is what the condition was after leaving the tabs open for about 3+ hours:

Saturday, August 9, 2008

Why is FDI out of US more profitable than FDI into the US?

Mihir Desai of Harvard Business School says that portfolio investments into the US have been far more profitable than foreign direct investment (FDI). Inbound FDI into the US has averaged a return of 4.3%, while outbound FDI from the US into other countries returns about 12.1%. At the same time, Wall Street went up more than any other market in the world. Why is that? Mainly because US companies traditionally invest in more controlled markets and have the advantage of cheaper capital and a better product and marketing portfolio (as a result of those controlled markets), while MNCs investing into the US have no such low-hanging fruit. [original article]

Why is it so difficult to make money as a direct investor in the United States? Indeed, much of the rhetoric on investing environments argues that the major destinations for U.S. outbound FDI—the developed markets of Europe and Japan and the emerging markets of China and India—are filled with capital controls and ownership restrictions. How can the United States as a destination end up being so much less attractive despite the relative absence of this usual litany of investment obstacles?

Part of the answer may lie precisely in how these obstacles tilt the playing field between local firms and multinational firms. In a series of papers, [HBS associate professor] C. Fritz Foley, [University of Michigan professor] James R. Hines Jr., and I have shown that distorted environments are precisely where multinational firms have an advantage relative to local firms. In countries with weak capital markets and burdensome regulatory regimes, multinational firms can use their internal capital and product markets to access global resources while local firms can't. In effect, these distorted environments burden local firms, create opportunities for institutional arbitrage for multinational firms, and can lead to a successful set of foreign activities for multinational firms.

The United States, in contrast, creates few such opportunities for low-hanging fruit for foreign multinational firms relative to local firms. As such, the conditions that may underpin the profitable experience of U.S. firms as they expand abroad are not there for foreign firms investing in the United States. More generally, the presence of highly competitive local firms in the United States undercuts efforts by foreign multinationals that don't have truly differentiated capabilities. Simply replicating strategies that were successful at home is likely to be insufficient in the United States.

Tuesday, August 5, 2008

Second Highest Bid auctions

Found this interesting post on Sriram Krishnan's blog where he describes the origin of the Vickrey auction that Google and Yahoo! use for online advertising. Very interestingly, although it has side-benefits of reducing the winner's curse and bid shading (see the links on Sriram's blog), the real reason this process was adopted [instead of the traditional English auction] is that Google's systems people wanted to reduce the server load that would have resulted from people changing their bids rapidly:

There have been several articles documenting the work of Google's Salar Kamangar and Eric Veach in bringing this to AdWords. What is lesser known (at least to me) is that they implemented this model to solve another problem entirely. I came across this old talk from a Google employee - in the speaker notes, it talks about how Kamangar and Veach implemented this feature to stop advertisers from logging into the system and modifying their bids constantly (since that's what people tend to do in an open English auction). By implementing a second-price auction, they were hoping to reduce the load on the system.
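The mechanics are simple enough to sketch in a few lines of Python (a toy illustration with made-up bidder names and amounts, not the actual AdWords implementation): the highest bidder wins but pays only the second-highest bid, so there is no payoff in constantly nudging your own bid up or down.

```python
# Toy sketch of a sealed-bid second-price (Vickrey) auction.
# Bidder names and amounts below are made up for illustration.
def second_price_auction(bids):
    """bids: mapping of bidder -> bid amount; returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # the winner pays the runner-up's bid
    return winner, price

winner, price = second_price_auction({"alice": 3.50, "bob": 2.75, "carol": 1.20})
# alice wins but pays bob's 2.75 -- raising her own bid above 3.50
# changes nothing she pays, so there is no point logging in to rebid.
```

Since the price you pay depends only on everyone else's bids, truthful one-shot bidding becomes the sensible strategy -- which is exactly why the rapid-rebidding traffic disappears.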

Monday, July 28, 2008

Google, SEO, Knol and the rest of the world

Google recently launched Knol, their Wikipedia competitor, which allows experts to own articles. The concept is interesting: where Wikipedia allows free-for-all authorship, Knol has experts author the articles and lists their owners clearly on each knol, which Google hopes will yield higher quality content. The editors will stake their prestige on the quality of the content, and over time Google could also share AdSense revenue with them.

However, it has also raised quite a storm in the teacup, since people are speculating that Google will take undue advantage of its search traffic to drive usage of Knol. Google has pretty much become the traffic policeman of the new web -- telling people where to go, and getting them there through its vast knowledge of the contours of internetland. However, as is often the case in India, what do you do when lawmakers become lawbreakers? When a cop's car breaks traffic rules, do you give them a ticket? While I am hopeful Google will not quite reach the level of the Indian police (or even Bennett, Coleman & Co.), the question of Knol getting undue advantage (as against the much better established Wikipedia) cannot be brushed aside.

The importance of Google's dominance of the web came to the forefront during a discussion at the Open Coffee Club's first meeting in Kolkata yesterday. Angshuman of Taragana complained that he had a hard time when Google dropped him from their indexes for some reason he is yet to figure out. He has several conjectures -- his WordPress translation plugin, due to which Google might have labelled all his pages as duplicate/spam, or changing his URL syntax using mod_rewrite -- but he couldn't really figure out what the problem was. Using the webmaster tools wasn't much help either. Finally, the way he resolved it was by telling the Google representative that he would stop his AdSense spending if his website wasn't restored -- he claims that is the only thing that works with Google. Being dumped from Google's indices is quite scary for any website owner, almost like not being reachable from the Start button on a Windows box, and there needs to be a better mechanism to deal with such 'mistakes'.

Microsoft has often been accused of using its Windows strength to push its other services, and now Google could do the same. Google has been the poster child of the internet, and we all continue to use its services in good faith, ignoring its trespasses into the content creation space and brushing aside its transgressions as mere mistakes -- but one can hear whispers today, and one expects them to soon grow into noise. The onus is on Google to uphold its "don't be evil" philosophy and communicate its positive actions proactively to the rest of the world. It has done well for the last few years, but the time has come to be more open, more forthcoming, and more accommodating, or it might find itself in the same boat as Microsoft, AT&T, and other monopolies of the past.

Sunday, July 20, 2008

The sub-prime crisis from K@W

Just discovered a great resource on YouTube -- Knowledge@Wharton has a channel there. See this video on the sub-prime crisis:

To add to the video, I have also heard that once the sub-prime crisis started making its presence felt, the prices of the homes the sub-prime borrowers had bought fell, and they realized that the amount they owed was higher than what they would get by selling the houses. That only precipitated the crisis.

There are more interesting videos on the channel, including this one -- an interview with Sunil Mittal where he talks about entrepreneurship and his beginnings in the bicycle industry.

Sunday, June 15, 2008

Water Powered car unveiled in Japan

Genepax, a Japanese company, has unveiled a car that can run on water. It apparently extracts hydrogen from water and uses it to generate the energy that fuels the car. The prototype was driven around the city of Osaka in Japan. Engadget has more details:

The key to that system, it seems, is its membrane electrode assembly (or MEA), which contains a material that's capable of breaking down water into hydrogen and oxygen through a chemical reaction. Not surprisingly, the company isn't getting much more specific than that, with it only saying that it's adopted a "well-known process to produce hydrogen from water to the MEA." Currently, that system costs on the order of ¥2,000,000 (or about $18,700 -- not including the car), but the company says that if it can get it into mass production, that could be cut to ¥500,000 or less (or just under $5,000).

There is a video from Reuters that I have tried to embed below, but I am not sure if it will show up on the final blog (here's the link to the Reuters page that hosts the video):

Update: Looks like there's more to it than meets the eye. See this discussion on Slashdot.

Tuesday, April 1, 2008

EasyEclipse - Making life easy for Developers

I went back to Java after quite some time, and had a tough time installing some plugins (the Visual Editor in particular -- it is not yet supported on the current Europa release, only on the previous Callisto). This site is really handy in case you want a setup with everything you need already installed. They have bundled everything together, and everything just works!

They have also divvied it up into broad areas such as 'Desktop', 'Web', 'LAMP' and so on, each targeted towards programmers in that category. Apart from these distros, it was also a great place to find out which plugins are the most useful for development on the Eclipse platform. I didn't even know there were such useful plugins for things like database management and SVN.

A lifesaver for people who aren't Eclipse experts, I must say.

Thursday, March 27, 2008

The World is Round Again!

Came across an interesting article while browsing the net for the Tata-JLo (!) deal yesterday. Pankaj Ghemawat, a chaired professor at Harvard Business School, disagrees with Tom Friedman's claim that globalization has reached its peak; he believes instead that a lot of trade, immigration, and "bits" travel only within national boundaries, and there is still a long way to go before we can knock down the walls we have built over centuries.

The findings fly in the face of Friedman's famous work. Take flows of people. Much as we would like to believe that this figure would be astronomically high, it is not. Says Ghemawat, "If you look at the stock of first-generation immigrants divided by the total population of the world, it is barely 2.9%."

In fact, he claims that in some metrics, we are just about reaching the 19th century level of globalization:

"On the people's side, the current ratio of immigrants to world population is slightly lower than in 1910. On the FDI side, we have probably reached new heights, but it wasn't until the 1990s that we got back to the FDI-to-GDP ratio that the world was seeing in 1901," says Ghemawat.

I can imagine this happening because of the FDI from Britain, France, and Spain into their colonies (which by then had been quite impoverished by monies being sent back as profits). A lot of the flow today is in the reverse direction, the Tata-JLo deal being a case in point. It would be interesting to see detailed numbers; perhaps they are in the book.

In fact, at some point I thought the article was claiming that there is actually increasing localization of products, which goes against globalization. For instance, Coke, Wal-Mart, and McDonald's have to take local tastes into account. I wonder whether that counts as more globalization or less. I guess parts of it can be argued either way.

Link to the original article.

Thursday, February 21, 2008

Should prizes make a comeback as against grants?

A very interesting article by Tim Harford about how prizes motivated a big chunk of the research that got productised in earlier times, and how they could be making a comeback. The advantages are quicker solutions, involvement of a more diverse community with more diverse ideas, less bureaucracy, fame and fortune for the inventors, and of course, problems getting solved. He cites how a competition was used to build a clock accurate enough to determine a ship's longitude, and how today everyone from the Gates Foundation (for pneumococcal diseases) to Netflix (for machine learning algorithms) is using a cash prize to get people to solve important problems. Prizes could also be used by governments to replace patents for solving large problems. He says:

Champions of prizes see them as a component of a wider system to promote innovation, rather than as an outright replacement either for grants or patents. Instead, the hope is that prizes will help to compensate for the specific weaknesses of those alternatives.

The downside of a patent is fundamental to its design: in order to reward an innovator, the patent confers a monopoly. Economists view this as, at best, a necessary evil since monopolies distort prices. In the hope of raising profits from some customers, they will price others out of a market. The most obvious victims are consumers in poor countries.

In an ideal world, prizes could replace patents. Instead of offering a patent for an innovation, the government could offer a prize. The inventor would pocket the prize but would not be allowed to exploit any monopoly power, so the innovation would be freely available to use in products for poor consumers – cheap drugs for Africa, for instance – and, importantly, in further innovations. But to explain that idea is to see its limitations. How could the government know enough about the costs and benefits – and even the very possibility – of an innovation to put a price tag on it and write the terms of reference for a prize competition? For this reason it is hard to see prizes replacing patents in most cases. But it is not impossible.

The modern heir to 18th-century prizes for canning, water turbines and finding longitude at sea is the advanced market commitment for vaccines for the poor: the goal is clear, the costs and benefits can be guessed at, and the quasi-prize nudges the patent system to one side with a prize contract that respects the patent but, in exchange for a large subsidy, radically constricts the holder’s right to exploit it.

Prizes may be an effective way to build technologies that solve a specific problem, but I doubt they can help in open-ended sojourns into the world of science. Most of our applied technologies are built upon basic scientific fundamentals, and I don't know if a gold rush will lead to the newest laws of physics or fundamental rules of mathematical logic. Issues of ownership of intellectual property are also a little ambiguous and have to be specified clearly up front. In many cases, gauging the ramifications of a new mathematical theory or basic physical law might be extremely difficult (which is the reason Nobel prizes are awarded only after the work has been established over the long term).

All said and done, I am sure prizes (not just the cash, but the fame and respect as well) make for great motivation, and we might see a lot more of them.

Friday, February 8, 2008

Bitwise 2008 - Can it get any better?

I still remember Bitwise 2006 very fondly -- all the last-minute action, with teams participating and us running around arranging problems and solutions, making sure everything ran properly. And it's been two years since then. It makes me very proud to see Bitwise 2008 progressing so well, with teams from 43 countries and clicks from 75! It can't get any bigger than this. It all started in 2001, and has come so far since then!

If you fancy yourself an ace programmer, if you think you can unravel the double helix in your sleep, if you live and dream algorithms and crunch numbers when nobody's looking, you've got to participate in Bitwise, the real test of your abilities. It's the largest algorithm-intensive online programming contest in India, organized entirely by students from one of the best engineering institutions in the country. You compete with the best brains in the area worldwide, and there is a sweet USD 2,500 on offer in prizes.

Do you have it in you? Visit the Bitwise 2008 site and register NOW.

Saturday, January 12, 2008

Swarm Intelligence

In a previous post on the honey-bee algorithm for allocating servers, I had referred to a paper on Swarm Intelligence by Eric Bonabeau and Christopher Meyer, published in the Harvard Business Review. I finally got time to go back and read it, and I found it quite fascinating! The paper describes case studies where people have used algorithms inspired by nature (ants, bees) which use a decentralized model of computation and optimization.

The paper points out that the main advantages of such algorithms are flexibility, robustness, and self-organization. The algorithms work in a completely decentralized manner, on the principle that the wisdom of all the ants (or other small agents) can be harnessed so that the whole is far greater than the sum of its parts. The algorithms are also invariably robust to failure and adaptive, since they don't rely on a central decision-making body and there is a lot of experimentation with new sources of food (or results, in the case of algorithms).

The paper also points out that there are several cases where these concepts have been used successfully (both in business and academia):

  • Optimally routing telephone calls and Internet data packets is a tough problem because a centralized algorithm would be neither robust nor adaptive. Algorithms based on swarm intelligence come to the rescue: they have no central decision-making body, and instead work on the principle that scouts recruit other agents to follow promising new paths.
  • Fleet management and cargo management suffer from similar problems. The paper points out that Southwest Airlines found that, in some cases, letting cargo go to the wrong destination and recovering afterwards is faster and more robust than making sure all cargo is always handled correctly.
  • Small, simple rules that let people take decisions for themselves usually work best. This has since been shown to work very well for companies such as Google.
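To give a flavour of how decentralized these schemes are, here is a minimal sketch of ant-style path selection (my own toy code, not from the paper): paths that worked recently carry more "pheromone" and get picked more often, yet every path keeps a nonzero chance of being explored, and evaporation slowly forgets stale routes.

```python
import random

# Toy sketch of ant-colony-style routing (illustrative names, not from the paper).
# `pheromone` maps each candidate path to its current pheromone level.
def choose_path(pheromone, rng=random.random):
    """Pick a path with probability proportional to its pheromone level."""
    total = sum(pheromone.values())
    r = rng() * total
    for path, level in pheromone.items():
        r -= level
        if r <= 0:
            return path
    return path  # guard against floating-point rounding

def reinforce(pheromone, path, reward=1.0, evaporation=0.1):
    """Evaporate everywhere (forget stale routes), then reward the path used."""
    for p in pheromone:
        pheromone[p] *= (1 - evaporation)
    pheromone[path] += reward
```

Notice that there is no central routing table: each agent only samples and reinforces locally, which is what makes the scheme robust to individual failures.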

There are more case studies in the paper, but what's fascinating is that these techniques are becoming even more popular nowadays because companies have realized that it is easier to tolerate failure than to eradicate it -- more so in the context of the Internet, where there is a race to build systems that are self-correcting (such as MapReduce, Hadoop, and Dryad). Also, the new realities of emerging architectures (multi-core, highly parallel, massive clusters and grids) mean that we have more parallel horsepower to run our applications, and such self-organizing algorithms are going to become even more popular in the computing community.

However, one concern would be the programming models for such computing bedrocks. We still don't understand how to manage parallel computation very well, so translating such algorithms into code is going to remain difficult for the average programmer for quite some time.

Friday, January 11, 2008

Parallel Programming + Type inference + Scientific notation: A Winner?

I came across this article in Linux Today which describes Project Fortress, an open-source effort from Sun to provide a language, intended as a successor to Fortran, for easily writing parallel programs. The project seems to be built on top of Java. Some salient features:
  1. Implicit parallelism: loops run in parallel by default; if you want to execute a loop sequentially, you have to say so explicitly. The big claim, of course, is using this efficiently on multi-core machines.
  2. Support for Unicode: as a result, the scientific research community can use Greek letters in their code, and even things like superscripts, subscripts, hats, and bars! This means your code is going to look a lot more like your algorithm.
  3. Automated type inference: the system has extensive type inference (the kind that functional languages and C# 3.0 have), which makes your code far more readable.
  4. Extensive library support: in fact, even some parts of the core system are implemented as libraries. They expose the parsed AST to the programmer and give them extensive control.
These sound quite interesting, and if Sun is successful with this effort, the scientific computing language of the future is going to look a lot like Fortress.
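Fortress's headline idea -- loops are parallel unless marked sequential -- can be mimicked crudely in other languages. Here is a rough Python analogue of an implicitly parallel loop using a thread pool; `parallel_for` is my own name for the helper, not Fortress syntax:

```python
from concurrent.futures import ThreadPoolExecutor

# Rough analogue of Fortress's parallel-by-default loop: each iteration
# may run on a different worker, but results come back in input order.
def parallel_for(func, items, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))

squares = parallel_for(lambda x: x * x, range(5))
# equivalent to [x * x for x in range(5)], just evaluated concurrently
```

The difference, of course, is that in Fortress the compiler gets to make this choice for every loop in the program, which is a much stronger claim than bolting a pool onto one loop by hand.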