I just finished Mass Effect 3, which I heartily recommend. Hell of a game. The ending was quite lame, sort of a cross between Contact and Terminator, but still. I imagine it must be hard to come up with an ending for a story of this scale.
For those unaware of it, Mass Effect is a series of games set in space in the 22nd century. One of the most interesting parts of the script is how it portrays artificial intelligence in this sci-fi universe. There are some robots around (they call them 'mechs'), but of course that's hardware, which is unimportant. What's important is software, and in ME there are two kinds of intelligent software. There are VIs, virtual intelligences: human-like interfaces programmed for some particular task. They speak and respond to language, but they are limited to whatever task they were programmed for. That might be as complex as running a ship or controlling a factory by itself. But it's still just a program.
Then there are AIs, artificial intelligences: self-aware, self-modifying, fully sentient intelligences. What we normally think of as an AI. Those are generally deemed dangerous, as they tend to go rogue, but you can still find them now and then. In this fictitious world both VIs and AIs are incredibly advanced, yet people aren't much different from real-life humans. 'Organics' are also said to have lots of enhancements, genetic engineering and whatnot, but it really doesn't show in the cast. Yours is no team of uber-geniuses.
But somebody has to code that software. Today we have made great progress in programming, but we aren't anywhere close to a functional AI, much less a self-programming, singularity-inducing super AI. We simply don't have the skills, and won't for decades at least. I mean, Google can't even design a proper email layout. And they're pretty good, relatively speaking.
Which brings me to the most likely scenario before we get a Singularity. Before we reach the point at which we can code something like a human brain into a computer, we'll have complete knowledge of the genetic basis of intelligence. And we will be able to act on that knowledge. Genetic engineering. Steve Hsu says it's easy. It certainly sounds that way.
After we get that right, then we can go on with coding AIs and making fancy robots. I think we would need an average IQ of 130 or so before we can have FTL travel and all that. But listening to the hype, you would think it's all just around the corner. We just need to wait. Meanwhile, 90% of software innovation today goes into tracking people's browsing to sell them targeted ads. Or Groupon.
No, we aren't going anywhere with software. The Singularity will be biological.
UPDATE: Greg Cochran seems to agree. If Cochran and Hsu say so, I'm in.