• March 17, 2006 |
8 p.m. ET
Fighting the next war: Three years after the war in Iraq began, the Pentagon is taking the lessons learned from past and present battles and turning them into technologies for future conflicts.
The best-known example is the multibillion-dollar effort to counter the roadside bombs that have been the bane of U.S. convoys. In the longer run, robotic vehicles pioneered at the DARPA Grand Challenge could reduce the risk to human warfighters — eliminating the scenario that imperiled Pfc. Jessica Lynch and her compatriots almost three years ago. Near-space platforms could someday put vital communication and reconnaissance posts far out of harm's way.
But the Pentagon's high-tech frontier includes smaller, shorter-term objectives as well as grand challenges: This week, the Defense Department announced its priorities for technology demonstrations for the current fiscal year — projects that aim to get supplies as well as information to the frontlines faster.
Currently, the Pentagon has two types of technological rapid-deployment programs: the advanced concept technology demonstrations, or ACTDs; and the brand-new joint capability technology demonstrations, or JCTDs. The JCTDs are tailored more to the needs of battlefield commanders than to the "technology push" from outside the combat zone, the Pentagon says.
The finalists were selected from more than 100 proposals submitted by the military as well as the defense industry. They may not sound as sexy as ray guns or robot soldiers. But they reflect the saying usually attributed to Napoleon, that "an army marches on its stomach" ... as well as the updated view that armies depend almost as much on their data.
Among the advanced concept technology demonstrations:
- Event Management Framework: An information network system that analyzes data from a wide variety of sources, including the military, federal agencies and other authorities — then comes up with a course of action. This sounds like the kind of antiterror "robo-blog" I talked about almost four years ago, but it could also raise the specter of an Orwellian cyber-spying system.
- Extended Space Sensors Architecture: The Pentagon says this would address "gaps in space situational awareness that increase risk for successful combatant command mission execution." The system would integrate technology from different areas, including national missile defense, to give those combat commanders a better picture of their surroundings.
- Joint Enable Theater Access (JETA): This involves a "Lightweight Modular Causeway System" that would make it easier for ships to unload fighters and equipment in seaports that otherwise couldn't handle the crush.
- Multi-service Advanced Sensors to Counter Obscured Targets (MASCOT): Warfighters would be able to root out concealed threats — conceivably ranging from those roadside bombs to guerrillas in their hidey-holes — by consulting a system that integrates sensor data from a variety of sources.
- Node Management and Deployable Depot (NOMADD): Just as JETA would fast-track deployment from ill-equipped seaports, NOMADD would facilitate a "factory-to-foxhole" distribution system, drawing upon such tools as RFID labeling.
- Small UAV: Unmanned aerial vehicles are nothing new for the U.S. military, which has pioneered the use of big Predator drones as well as tiny robo-planes. But the Pentagon says it wants to develop "new tactics, techniques and procedures across the military services for small-unit, real-time reconnaissance and surveillance capabilities."
The joint capability technology demonstrations include:
- Counterintelligence-Human Intelligence Advanced Modernization Program / Intelligence Operations Now (CHAMPION): This is an information-technology initiative aimed at optimizing the "reporting of critical intelligence-related data in a timely manner, while making data available for analysis." Among the strategies: standardizing data outputs, using XML tags, providing geotagging and enabling Web services.
- Comprehensive Maritime Awareness: This information system would take in data from a variety of sources on potential maritime threats, allowing authorities to focus their resources on the most probable ones.
- Joint Modular Intermodal Distribution System: The military is developing a standardized system of containers and pallets for automated loading, handling, storage, tracking and surveillance.
- Large Data: This project would demonstrate a rapidly deployable, high-security system for sending huge amounts of data — including pictures and video — to and from the battlefield.
To keep up with the latest in military technology, you can always check my colleague Noah Shachtman's blog at Defense Tech.
• March 17, 2006 |
8:30 p.m. ET
SpaceX test set for Saturday: SpaceX spokeswoman Dianne Molina reports that the static-fire test of the company's Falcon 1 rocket has been set for Saturday rather than today. "Timing became an issue to proceed today, and the team wanted to take their time to ensure a successful systems test," she said in an e-mail. If the test is successful, the company would proceed with planning for a launch attempt in the March 20-25 time frame.
• March 17, 2006 |
8 p.m. ET
Debugging the computer industry: I received a ton of e-mail in response to this week's item on the "problem with programming." Some said computer science is losing popularity among students primarily because the field pays so poorly, while others pointed to other factors. Here's a selection of the feedback:
Kevin Cabral: "I read with interest your blog on the state of computer science and developer positions in this country. I would like to broaden the discussion a little and include all of science in this country. We have a Congress and administration that 'supports' science but then reduces the science budgets. Look at NIST, Department of Energy, the Superconducting Supercollider project in Texas, NASA (even with increases in budgets, Congress earmarks the allocation and other departments are forced to close) and other government research positions. It doesn't pay well, nor are there many jobs. Corporations in this country have reduced R&D budgets since the early '80s unless it's defense (very cyclical in nature) or pharmaceuticals. We have a systemic problem in this country dealing with science. My advice for a young person is: Get a job in major-league sports or Hollywood — both pay very well."
Scot Larsen: "I read your article and some of the responses you posted at the end. I am not sure what these folks are talking about. I am VP of software development for a Southern California brokerage trading solutions company. We are constantly struggling to find qualified and motivated local programming talent. And I do not mean someone with at least six years of experience. In the past year alone, we have hired roughly half a dozen college grads with a starting salary in the mid- to high $40,000 range. We have developers with six to 10 years of experience making over $100,000. Part of the reason that companies are using overseas developers, beyond cost issues, is that it is difficult to find talented software developers in the U.S. There are a multitude of local candidates floating around who have a high-school diploma and maybe two years of technical college, but most companies will go overseas before taking a chance on someone with relatively little training."
Kelli McLure: "As an IT professional with over 20 years in the field, I tell every student I meet to not go into IT or computer programming. The jobs are going over to India at an ever-increasing rate. There are no entry-level positions left where I work, and every year more of my co-workers are training their Indian replacements. I’ll be surprised if I am still employed five years from now. I guess I'll go into teaching!"
Lee: "... If someone who has a computer science degree is making $30,000 a year, I suggest three things: Quit that job, move to another area, take a class or two on whatever the employers are looking for. Java is exploding in the Midwest right now, and contractors would be able to place you in no time."
Theo Hughes: "I had been a programmer for over 10 years when the dot-com crash happened in 2001. I was abruptly laid off in February 2002 and spent the next three years looking for work. I have work now at less than half what I used to make, but it was hard finding anything for a person with my experience and qualifications. I was either 'overqualified' (read: too old, too much experience, and way too much just to pay any bills) or 'underqualified' (when I tried to break out of programming). I finally resorted to omitting programming from my work experience just to get a $10-an-hour job. I'm back in it now, but I wouldn't recommend it to anybody as a career ... and wish I had some way out. There are thousands like me out here, and I don't think we are going to encourage anybody's children until the H1B visas stop and American businesses realize that 'farming it out' is fine until you need something fixed or changed — that's when they usually get into trouble."
Gustavo Keener: "It’s funny you should ask the question of 'How can we get more students interested in programming?' Yesterday, I was helping out a student studying C# at a local technical college. The programming assignment he was given was so spoon-fed that I was thinking, 'Man, do you even go to class!?' All the logic was there, minus a few tidbits of implementation that my friend had to come up with. His main problem was that he lacked the simple algebraic concept of substitution to understand which methods should be used where, or how, for that matter. If a school wants to find potential programmers, they really need to be focusing on people that can pass algebra with flying colors. If a person doesn’t understand the basics of functions, variables, and substitution, they’re not going to have an easy time trying to get into programming."
Philip Budzik, Alexandria, Va.: "One can spend a lot of time building a very efficient and elegant coding solution or build a 'Rube Goldberg' that barely functions properly. The problem is that it is unlikely anyone will notice the difference. A painting can be immediately appreciated even if the viewer has no talent for painting. Software code, when it is well-written, is only appreciated by, at best, a few people. So it is frustrating to work hard to do an excellent coding job when a mediocre coding job is appreciated as much as an excellent job. In other words, a programmer really working hard to do the best job possible is likely to be unappreciated and come to feel that way."
Robert: "I am 29 years old and have been programming computers since the late '80s. ... I considered computer science as a major in college. But as a senior at SIU studying electronics, I avoided the computer science major for two main reasons. First, computer science majors make little money to begin with, unless they get a master's or doctoral degree. Second, as a computer science major it's great to know how to design heuristic data structures, but I still never learned how to display 'Hello World' inside a dialog box and work with a graphical user interface. What's the point? I love to program, and if I could make a living programming and know with some security that I could get a job very shortly after college, I might have chosen programming. The worst thing is there are high school students who can do more with no formal training. Let them take low-wage programming jobs. They don't have student loans to pay off."
Byron: "Programming is the process of automating human tasks. The ultimate goal of any programmer is to code himself out of a job. As computers have become more sophisticated, the level of tasks that can be automated has become less and less mundane. Every computer generation, or cycle, has a number of tasks that can be automated. Today's cycle, to put it in a rather unsophisticated way, is getting more and more people communicating with each other through the Web. Once these tasks have been automated, there is no further need for programmers, except perhaps to fix bugs. Until the next generation comes along, and then they are in sudden need again. ..."
Kris: "I must be an anomaly. I have been working as a software engineer/developer since 2000 without the benefit of a four-year degree and I have been making decent money at it (currently just under $100,000 a year). I was able to find work even during the hardships of the dot-com blow-up. I believe, though, that the perceived shortage of qualified computer programmers in the U.S. is a much more complicated issue than stated:
"(1) In the late 1990s, due to the rise of the dot-com industry, many people went into computer science who probably wouldn't have considered it previously. The news media played up the playboy lifestyle that seemed prevalent in many software development houses and a lot of people wanted to be a part of that. I remember working in a lab at the university and hearing a would-be computer science major asking 'What is source code? What is the executable?' Now that the luster is gone, a lot of those people who were aiming for easy work and high salaries feel disenfranchised. The work that is left seems humdrum and tedious and will only appeal to those people who actually enjoy programming.
"(2) You have to have a passion for programmatic problem-solving — otherwise you're likely to give up. I fell in love with computers with the advent of the Apple, and I spend just as much time learning new languages and paradigms for fun as I do for work. You cannot expect that what you learned in college will be sufficient to sustain you throughout your career.
"(3) H1B visas: Yes, there is definitely a chance that your job will be outsourced. But there are very few jobs that are impervious to foreign competition: accounting, medical testing, manufacturing. Unless you are in a service-related industry or construction, I think your job is at risk, too.
"(4) I think some of this has to do with leftover jock/nerd mentality from high school. There is an IBM commercial that totally exemplifies this, where a management team is out on a lake and they have to decide which person to throw off. They initially vote to throw off the IT guy and then find out their document room has flooded. The IT guy steps up and saves the day after the other clueless department people start to panic. Yet who do they toss in the lake? The IT guy, of course, because he's geeky and awkward.
"(5) Computers are so pervasive now that virtually every professional has a little experience using them, which has created 'hybrid' jobs (business + IT). Many of the M.B.A. people I have worked with can write simple Excel automation.
"(6) The people who seem really apt at computer science tend not to have very good social networking skills and/or feel that resumes are beneath them. I don't know why, but I swear it's the truth. Many of these people languish in low-paying tech jobs despite the fact that there are better-paying jobs out there that are within their grasp.
"That came out as a jumbled mess, but the point is that if you really enjoy programming and are driven, you can still find a good programming job here in the United States. Furthermore, I think that the future outlook is still strong and that the demand for computer jobs will continue to grow."
• March 17, 2006 |
8 p.m. ET
Weekend field trips on the World Wide Web:
• Technology Review: 10 emerging technologies
• Discovery Channel: 'Perfect Disaster'
• 'Nova' on PBS: 'America's Stone Age Explorers'
• New Scientist: Earth rocks could have taken life to Titan
• The Economist: A Moore's Law for razor blades?
• Panda's Thumb: Life on Mars and intelligent design
In today's image advisory, the Spitzer team reports that the clouds contain polycyclic aromatic hydrocarbons, or PAHs, which can be found on Earth in tailpipes, barbecue pits and other places where combustion has occurred. Astrobiologists believe PAHs could serve as the building blocks for primordial life.
Even before Spitzer focused on the Cigar, astronomers knew that the galaxy was going through a middle-age crisis of sorts. Gravitational interaction with a larger galaxy called Messier 81 has caused a resurgence of starburst in M82, which is 12 million light-years away in the constellation Ursa Major. (This picture from NASA's Galex mission shows the two galaxies in the same stunning frame.)
In the past, other researchers have picked up on cone-shaped emissions of very hot gas coming from the Cigar's core — so Spitzer's astronomers suspected that there would be emissions of dusty exhaust as well.
"Usually you see smoke before the fire, but we knew about the fire in this galaxy before Spitzer's infrared eyes saw the smoke," said David Leisawitz, Spitzer program scientist at NASA Headquarters in Washington.
The clouds can hardly be seen in the visible light spectrum, but in the infrared, the Cigar's smoke stands out as the biggest galactic clouds ever spotted by Spitzer. The clouds' wide extent seems to hint that the exhaust is being emitted not just by stars at the core, but by stars throughout the galaxy. That's not what astronomers expected to see.
"Spitzer showed us a dust halo all around this galaxy," said Charles Engelbracht of the University of Arizona at Tucson. "We still don't understand why the dust is all over the place and not cone-shaped."
The full details are to be published in an upcoming issue of the Astrophysical Journal. In addition to Leisawitz and Engelbracht, the authors include Praveen Kundurthy and Karl Gordon of the University of Arizona. The image was taken as part of the Spitzer Infrared Nearby Galaxy Survey, led by the University of Arizona's Robert Kennicutt.
To learn more about Spitzer and infrared astronomy, check out the Web site for the Spitzer mission, this archived story on Spitzer's debut (with a cool picture of M81 and a slideshow), and our interactive graphic on the electromagnetic spectrum.
• March 16, 2006 |
8 p.m. ET
Scientific smorgasbord on the Web:
• The New Yorker: How to dodge the global warming issue
• BBC: Pentagon plans cyber-insect army
• NASASpaceFlight.com: X Prize's Ansari aims for space station
• Space.com: The science behind the spokes in Saturn's rings
• March 15, 2006 |
8:15 p.m. ET
Starry river discovered: Astronomers have detected a narrow, incredibly faint stream of stars stretching halfway across the northern sky — a "river of stars" that could provide new insights into the mysterious dark matter in our own Milky Way galaxy.
The river is actually a tidal tail, pulled loose from a cluster of about 50,000 stars known as NGC 5466 in the constellation Bootes. The star cluster traces an elliptical orbit around the Milky Way's center. During each close approach to the center, the gravitational pull stretches the cluster and strips away stars, said Carl Grillmair of the California Institute of Technology's Spitzer Science Center.
"We were blown away by just how long this thing is," Grillmair said in a Caltech news release. "As one end of the stream clears the horizon this evening, the other will already be halfway up the sky."
The stream is far too faint to be seen with the naked eye. In fact, until now, the stream couldn't be distinguished from the masses of other stars in the Milky Way's disk. Grillmair and Johnson found the stream by analyzing the colors and brightnesses of more than 9 million stars in the Sloan Digital Sky Survey's public database.
"It turns out that, because they were all born at the same time and are situated at roughly the same distance, the stars in globular clusters have a fairly unique signature when you look at how their colors and brightnesses are distributed," Grillmair said.
The astronomers found about 1,000 stretched-out stars that bore the signature of the parent star cluster, but Grillmair told me there may be tens of thousands of other stars in the stream that are just too faint to be spotted by the sky survey. In fact, the stream may well stretch across the whole sky, he said.
Eventually, NGC 5466 could dissolve into a super-stretched-out stream, though Grillmair suspects that the cluster will last for "several more orbits" around the center.
The shape of the stream serves as a fossil record of the gravitational interactions, he said. As explained in today's news release, the lost stars that find themselves between the cluster and the galactic center begin to move slowly ahead of the cluster in its orbit, while the stars that drift away from the galactic center fall slowly behind.
"You can measure the energy as a function of radius," Grillmair observed. Thus, variations in the gravitational field over time would affect the shape of the stream's arc. For instance, if there are concentrated chunks of dark matter in the halo surrounding the Milky Way, the arc should take on something of a serpentine shape, he said.
However, NGC 5466's tidal stream — and another, even longer tidal tail that has been detected just recently — turn out to be "amazingly smooth," Grillmair said.
"That argues that there are not gigantic masses out there, so the dark matter would have to be evenly distributed," he said. Or it could be that the chunks of dark matter are located farther out, resulting in much less of a gravitational effect. In either event, more analysis of tidal tails could eventually tell the tale.
• March 15, 2006 |
8:15 p.m. ET
More food for thought on the World Wide Web:
• Defense Tech: The enemy is me
• Science @ NASA: Quake-proof housing on the moon
• Slate: My eyeball just fell out of its socket
• Download the latest issue of Sub Rosa magazine
My story on Carnegie Mellon University's Alice programming tutorial focused on efforts to get more students interested in computer science by making coding less frustrating and more fun. Educators say that just might reverse a trend indicating that the field is drawing less interest from young people.
But there's an economic side to the issue as well: It's no secret that computer careers have lost some of their luster , due to international outsourcing, dot-com reversals and other downsides. Many of those disillusioned information-technology workers would take cold comfort from today's report that IT employment has risen nearly 1.8 percent over the past year (PDF file).
The disillusionment came through loud and clear in a couple of messages I received in response to the story about Alice:
Steve Bowen: "I just read your article about the Alice software that is designed to encourage more American graduates to become computer programmers. What I found most interesting is the part you left out. Students have seen the landscape of future job opportunities, and those opportunities are no longer desirable to those who have the skills programming needs. Corporations have besieged Congress every year to increase the number of H1B visas, with Congress as a willing accomplice, so they can hire foreigners, especially from India, to work in this country for about 60 to 70 percent of the going rate for American programmers. That seems to leave the door open only to those students whose best hopes for a job lie in expected salaries of $30,000 or less upon graduation. In other words, those who are not as qualified.
"I think you got the story all wrong. American students are not lacking in skill or desire. They are simply smart enough to see that computer programming job opportunities are becoming more limited every year. And in that, they see the potential of a wasted educational opportunity and make better choices."
Karl: "I read your article, and I don't understand why so many people are so clueless. That huge drop-off has nothing to do with difficulty; it has to do with jobs. I absolutely love programming and I am good at it. But here I am, working tech support for below $30,000 a year. The plain truth is, companies do not want to hire a programmer without half a dozen years' experience, and guess what? You can't get that unless you are hired, and you do not get that working tech support. Unfortunately for me, I have a B.S. in computer science, with an underscore to 'B.S.' Most people now are wising up and jumping off a sinking ship. Why would anyone major in computer science/IT/tech when they will work twice as hard as anyone else and get paid half as much? You see companies like Lockheed Martin offering high-school students free tuition to college for internships there, but never look at the already-crowded pool of tech majors finding different jobs or going back to school for a different degree. My generation (tech grads 1999 and 2002) is the lost generation. And CMU is full of foreigners getting free tuition so they can go back from where they came from to start their own enterprises."
So I guess this issue touched a nerve — and sometimes, that's what having a blog is all about. Do you think getting more young people interested in computer science is addressing the wrong side of the problem? Let me know what you think, and I'll pass along a selection of the more thoughtful responses.
• March 14, 2006 |
9 p.m. ET
Back into the pool: Are you still mulling over your entry in the NCAA basketball office pool? In the wake of Monday's item on the "science of office pools," University of Minnesota biostatistician Bradley Carlin shared his perspective on picking a winning combination in the 63-game bracket.
As I mentioned on Monday, statistical formulas won't give you much of an edge if your pool follows the standard "just-pick-the-winners" formula. "There's not much for me to exploit," Carlin told me. However, he added, "If the poolmaster rewards you for doing something other than that — for example, picking upsets — then that's the kind of situation I can exploit."
That's how he won his own office pool three times out of five. Carlin used computer analysis, such as that offered by the Poologic Web site, to figure out which upsets could yield the most points. And he suspects that's why his poolmates disbanded the old pool and set up a new one that wasn't so heavy on the incentives for picking upsets.
"I've struggled since then," he said.
Yet another strategy involves making a "contrarian" pick: Instead of going with UConn or Duke, select a less heavily favored contender to win it all, such as Villanova or Memphis — then build your bracket around that outcome. The perceived probability of winning may be lower, but if you guess correctly, you'll get more of the glory and the points.
The bottom line still stands: Let the bettor beware. "I have yet to triumph with this new style of contrarian thinking," Carlin said.
As a postscript, you might be wondering why a biostatistician like Carlin is concerning himself with NCAA basketball. You might even be wondering exactly what a biostatistician does. Carlin explained that most of his work involves studying statistical models for epidemiology and the distribution of natural phenomena. It turns out that some of the data sets describing the spread of an infectious disease apply to "March Madness" as well. Suddenly, it all makes sense.
• March 14, 2006 |
9 p.m. ET
Your daily dose of science on the World Wide Web:
• N.Y. Times (reg. req.): Far out ... but is it quantum physics?
• Wired.com: Pushing the Internet into space
• Transterrestrial Musings: Elevator to nowhere?
• Red Herring: Scientists model entire virus (via Slashdot)
• March 13, 2006 |
4:30 p.m. ET
The science of office pools: Now that the NCAA has released its matchups for the "March Madness" college basketball tourney, it's prime time for practitioners of "bracketology," the art and science of figuring out how to fill in all those photocopied bracket diagrams for office betting pools.
There's certainly some "mathness" to the method of formulating the tournament itself, which can produce 9.2 quintillion possible outcomes. But is there a way to game the system scientifically?
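For the record, that quintillion-scale figure is just bracket arithmetic: 64 teams means 63 games, and each game has two possible winners. A quick sketch of the math:

```python
# A 64-team single-elimination tournament takes 63 games to crown a
# champion, and each game has two possible winners, so the number of
# distinct ways to fill in a bracket is 2**63.
teams = 64
games = teams - 1        # 63 games in the bracket
outcomes = 2 ** games
print(f"{outcomes:,}")   # about 9.2 quintillion
```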
Some would say not. In fact, one of the top-ten tips from MSNBC's own expert, Michael Ventre, is to beware of geeks bearing graphs. That advice might stand you in good stead if it's a simple exercise of who can guess more of the winners. Your best strategy in that case might be to go with all the higher-seeded teams. But if there's an extra reward for picking upsets, a little statistical analysis just might give you a bit of an edge. At least that's the claim made in research going back a decade.
The seminal works in the field include "March Madness? Strategic Behavior in NCAA Basketball Tournament Betting Pools," Andrew Metrick's study in the August 1996 issue of the Journal of Economic Behavior and Organization; and "How to Play Office Pools If You Must," written by David Breiter and Bradley Carlin and published in the Winter 1997 issue of the journal Chance.
Those studies are just the start: Online, you can peruse academic works such as "March Madness and the Office Pool" from the Yale School of Management (PDF file) and "Optimal Strategies for Sports Betting Pools" from Saint Louis University (PDF file). The research tends to go into how Las Vegas point spreads and Monte Carlo simulations can be tweaked to identify the best bets for upsets. And that's where the edge comes in.
"In large pools there can be a significant advantage to picking upsets that differentiate your picks from the crowd," Saint Louis University's Bryan Clair and David Letscher explain in "Optimal Strategies." They even claim that using such strategies can outperform garden-variety picks by an order of magnitude.
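To see why an upset bonus can flip the calculus, here's a toy Monte Carlo sketch of a single game in an upset-rewarding pool. The win probability and point values below are invented for illustration (they are not from Poologic or the papers cited above), but they show how a big enough bonus can make the statistically riskier pick the better bet on expected points:

```python
import random

random.seed(42)

# Assumed numbers (illustrative only): the favorite wins 70 percent
# of the time; a correct favorite pick scores 1 point, while a
# correct upset pick earns a 3-point bonus-weighted payoff.
P_FAVORITE_WINS = 0.70
FAVORITE_POINTS = 1
UPSET_POINTS = 3

def expected_score(pick_upset, trials=100_000):
    """Monte Carlo estimate of average points per game for a pick."""
    total = 0
    for _ in range(trials):
        favorite_won = random.random() < P_FAVORITE_WINS
        if pick_upset and not favorite_won:
            total += UPSET_POINTS       # upset pick pays only when the upset happens
        elif not pick_upset and favorite_won:
            total += FAVORITE_POINTS    # favorite pick pays when form holds
    return total / trials

print(f"favorite pick: {expected_score(False):.2f} pts/game")  # ~0.70
print(f"upset pick:    {expected_score(True):.2f} pts/game")   # ~0.90
```

Under these made-up weights, the upset pick earns about 0.9 points per game against 0.7 for the chalk pick, even though it's wrong most of the time. The real tools fold in Vegas point spreads and the pool's actual scoring rules, but the mechanism is the same.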
One of the strategies is not to give in to an overperception of favorites. "This year, people are going to overbet UConn and Duke. They're doing that already," said systems analyst Tom Adams, who has distilled all the math into an arsenal of Java-based tools on his Poologic Web site.
Even if you can't tell the difference between UConn and the Yukon, Poologic will help you fill out a bracket passable enough to keep you from becoming the pool's laughingstock. And who knows? You might become successful enough to inspire some geek envy.
Adams said that's what happened to Carlin, the University of Minnesota statistician who co-wrote "How to Play Office Pools If You Must."
"After winning three out of five years, Brad Carlin, the guy who came up with this, was basically kicked out of his pool," Adams said. "I think that may be a hazard of this."
But let the bettor beware: Over the past seven years or so, Adams himself hasn't yet parlayed the statistical tricks into office-pool profits. "I'm a little bit behind," he told me, "probably $20 in the hole. ... Even if you have an edge, you can get in the hole."
• March 13, 2006 |
4:30 p.m. ET
Explore Mars on your desktop: Google's latest mapping project offers zoomable views of the Red Planet in three flavors, including a rainbow-hued elevation view. Google Mars, the follow-up to its Earth maps and moon sampler, makes use of high-resolution visible-wavelength imagery and laser altimeter data from NASA's Mars Global Surveyor orbiter, as well as infrared views from the thermal imager aboard the Mars Odyssey orbiter.
NASA's own mapping tool, World Wind, is being upgraded to provide Martian imagery as well, drawing upon the Mars Digital Imaging Model — which in turn is based on Viking orbiter data. Version 1.3.4, which of course also provides Earth and moon imagery, is due for release in about two weeks, says NASA's Patrick Hogan.
But wait ... there's more: NASA's Jet Propulsion Laboratory and Arizona State University, which is in charge of Mars Odyssey's thermal imager, are releasing a simulated fly-through of Valles Marineris, the Red Planet's largest canyon, as a three-minute Web movie titled "Flight Into Mariner Valley." The special effects include Martian dust devils and comparisons with Earth features, and the whole thing is narrated by ASU planetary geologist Phil Christensen. I can hardly wait for the sequel ...
• March 13, 2006 |
4:30 p.m. ET
Science and strangeness on the World Wide Web:
• PhysOrg: The physics of friendship
• Discovery.com: Why birds blush
• Scientific American: A crater jumper for the moon
• Cosmeo: Homework help from the Discovery Channel
• NepalNews.com: 'Little Buddha' goes missing
• Journal of Irreproducible Results: Funniest-graph contest
Looking for older items? Check the Cosmic Log archive. Share your perspective on cosmic subjects with Alan Boyle. If you link to this page, you can use http://cosmiclog.msnbc.com or http://www.cosmiclog.com as the address. MSNBC is not responsible for the content of Internet links.