Latest Technology: NEWS SCIENCE ENGINEERING DEVELOPMENTS/INVENTIONS

Wednesday 28 September 2011

World’s first flying car prepares for take-off


Is it a car? Is it a plane? Actually it’s both. The first flying automobile, equally at home in the sky or on the road, is scheduled to take to the air next month.
If it survives its first test flight, the Terrafugia Transition, which can transform itself from a two-seater road car to a plane in 15 seconds, is expected to land in showrooms in about 18 months’ time.
Its manufacturer says it is easy to keep and run since it uses normal unleaded fuel and will fit into a garage.

Carl Dietrich, who runs the Massachusetts-based Terrafugia, said: “This is the first really integrated design where the wings fold up automatically and all the parts are in one vehicle.”

The Transition, developed by former Nasa engineers, is powered by the same 100bhp engine on the ground and in the air.
Terrafugia claims it will be able to fly up to 500 miles on a single tank of petrol at a cruising speed of 115mph. Up to now, however, it has been tested only on roads at up to 90mph.
Dietrich said he had already received 40 orders, despite an expected retail price of $200,000 (£132,000).
“For an airplane that’s very reasonable, but for a car that’s very much at the high end,” he conceded.
There are still one or two drawbacks. Getting insurance may be a little tricky and finding somewhere to take off may not be straightforward: the only place in the US in which it is legal to take off from a road is Alaska.
Dietrich is optimistic. He said: “In the long term we have the potential to make air travel practical for individuals at a price that would meet or beat driving, with huge time savings.”

Friday 23 September 2011

Portable Gaming Devices

Just before the turn of the millennium, portable gaming was booming. Most people who had a portable gaming device sported a Gameboy, and the really lucky ones had a Gameboy Color. There was always that one rich kid at Pizza Hut, too, with the Sega Game Gear that people couldn’t stop talking about, and Pokemon was taking the world by storm.
At night, you would fall asleep with images of the Atari Lynx swimming around in your head and hope that one day you could afford one. 10 years later, you’re glad that your parents never sprung for one of them.
Not long after that part of your life, the portable gaming world changed dramatically. With its staunch lead, Nintendo could afford to be more experimental, and it paid off. The Gameboy Advance put graphics slightly better than the Super Nintendo’s, along with quality sound, in the palm of your hand; a few years later, the Nintendo DS managed to squeeze out post-N64-quality gaming with a touch screen, opening up a whole new world of gaming possibilities.
What’s more impressive is that, for once, Nintendo had some serious competition in the portable gaming racket. Sony took its powerful juggernaut, the PlayStation 2, and compressed it into a comfortable handheld console that demanded respect. The graphics and sound quality Sony was renowned for could now be enjoyed, portably, on a beautiful 4.3″ screen.
If only you knew, back in your room 10 years ago, that things were about to change, and that one day you would forget all about the Atari Lynx… until you read about it in a Listverse article.

Solid State Data Storage

In 1999, if you had an 8-gigabyte (GB) hard drive, you were the cool kid on the block. “What can you possibly fill that whole thing up with?” your friends would ask you.
Computer games you bought at the store fit on a single CD-ROM, and everyone knew that you had to wait for your hard drive to spin up before each level. Those of us who were especially careful with our expensive, magnetic, spindled drums of data would even run Scandisk and Defrag on them (which would take hours, of course). Things were looking up, too, as hard drive experts predicted that in the year 2000, 30GB hard drives could be as cheap as $200.
Now, imagine yourself waking up in the middle of the night because someone outside your window is throwing pebbles at it. When you open the window, you notice that they look exactly like you, only about 10 years older. They tell you not to worry, because in a single decade hard drives will be ridiculously smaller, lack any moving parts, be practically weightless, and withstand far more brutal environments.
Oh, and they’re far cheaper, too. You know that 8GB hard drive you just spent $150 on? You can get one that fits in your coin pocket for $15, down the street.

Interactive map of the internet's underwater paths

Ever wondered how your email can cross the vastness of the ocean and be delivered almost instantly, anywhere in the world? It's all down to a network of fibre-optic cables that link up the continents and transmit terabits of data every second.
Thanks to TeleGeography, a US telecommunications research firm, you can now view these submarine cables on an interactive map and get a sense of the physical infrastructure that keeps the internet going.
The map shows 188 active and planned submarine cables, along with their landing points. Clicking a cable gives you more information, such as its name, its length, who owns it and where it meets land. Clicking a landing point will also tell you which cables terminate at that location.
The map is only a stylised representation, so the real cables and landing points may lie in slightly different locations. That should protect cables from thieves, who have caused communications outages in south-east Asia, but it won't help ships avoid breaking cables by dropping anchor.

Wednesday 21 September 2011

Sprint files suit to block AT&T's T-Mobile merger


 Add Sprint to the list of big names choosing to speak now rather than hold their peace on AT&T and T-Mobile's dreams of corporate matrimony.
Saying it would violate the Clayton Antitrust Act, Sprint today filed a lawsuit opposing the deal before the same Washington, D.C.-based federal judge who received a related suit filed last week by the Department of Justice.
"Sprint opposes AT&T's proposed takeover of T-Mobile," said Sprint Vice President of Litigation Susan Haller in a statement. "With today's legal action, we are continuing that advocacy on behalf of consumers and competition, and expect to contribute our expertise and resources in proving that the proposed transaction is illegal."
In a copy of the complaint obtained by CNET, Sprint claims that "AT&T's proposed takeover of T-Mobile is brazenly anti-competitive. In one fell swoop, AT&T's proposed purchase would eliminate one of four national competitors and marginalize a second (Sprint), pushing the market back toward a 1980s-style cell phone duopoly that would force consumers to endure higher prices and be denied the fruits of vigorous innovation."
Sprint claims that AT&T's $39 billion deal to acquire T-Mobile from Deutsche Telekom would also lead to "higher prices" and harm Sprint and other small wireless carriers because a much bigger AT&T would have increased "control over backhaul, roaming and spectrum, and its increased market position to exclude competitors, raise their costs, restrict their access to handsets, damage their businesses and ultimately to lessen competition."
AT&T has argued that the deal will mean more efficient and improved services, particularly in rural America, and the company also pledged to bring 5,000 call center jobs back to the United States if the acquisition goes through. If it doesn't, AT&T could still have to pony up a $3 billion kill fee to T-Mobile.
In response to Sprint's suit, an AT&T spokesman today said it demonstrates what AT&T has been saying all along. "Sprint is more interested in protecting itself than it is in promoting competition that benefits consumers."
The spokesman added that AT&T "will vigorously contest this matter in court as AT&T's merger with T-Mobile USA will: help solve our nation's spectrum exhaust situation and improve wireless service for millions; allow AT&T to expand 4G LTE mobile broadband to another 55 million Americans, or 97 percent of the population; and result in billions of additional investment and tens of thousands of jobs, at a time when our nation needs them most."
Sprint, however, argues in the suit that the deal would essentially neuter the entire wireless market, not only by marginalizing Sprint itself, but also by removing T-Mobile, which it calls "a low price and innovative maverick competitor that provides particularly disruptive competition in the marketplace."
The complaint goes on to posit a new possible consequence of the acquisition with regards to its impact on Verizon, currently the largest wireless carrier in the U.S.: "Verizon, AT&T's most significant competitor post-merger, would not have the incentive to constrain AT&T, and would have a substantially increased incentive to coordinate with AT&T rather than compete."

Arizona company drops iCloud suit, changes name


An Arizona company that sued Apple over its use of the iCloud name seems to have changed its tune--and its name.
iCloud Communications alleged in a lawsuit filed in June that the name of Apple's online storage service copied its name and caused confusion over competing products. The lawsuit sought an injunction against Apple's use of the iCloud name, as well as an unspecified amount of monetary compensation.
The company went so far as to say that "Apple has a long and well-known history of knowingly and willfully treading on the trademark rights of others." But the Phoenix-based voice over IP provider on Thursday filed a notice of voluntary dismissal with the U.S. District Court of Arizona that precludes the claim from being refiled:
 Please take notice that, pursuant to Rule 41(a)(1)(A)(i) of the Federal Rules of Civil Procedure, Plaintiff I Cloud Communications, LLC dismisses its claims against Defendant Apple Inc., with prejudice and without costs or attorneys' fees to either party.
In another development, the company appears to have changed its name to Clear Digital Communications. A Facebook page for a company called Clear Digital Communications lists the same address and contact information as an iCloud Communications Facebook page, as well as a wall post from August 12 that says, "iCloud is now Clear Digital Communications." The Clear Digital Communications page also includes a profile picture with the name iCloud Communications.

Groupon postponing IPO over market chaos?


Daily-deals provider Groupon has decided to postpone its initial public offering, The Wall Street Journal is reporting, citing anonymous sources.
The Journal's sources claim that Groupon's management has decided against an IPO anytime soon due to the stock market's continued "volatility." Initially, the Journal's sources claim, Groupon was planning to price its shares during the middle of September and go public soon thereafter.
Groupon filed for its IPO with the U.S. Securities and Exchange Commission in June. The $750 million IPO could value the company at a reported $20 billion to $25 billion, depending on the number of shares it would eventually offer.
When Groupon announced plans to go public, the stock market appeared welcoming for companies hoping to score big with an IPO. In May, LinkedIn saw its shares soar 109 percent in its first day of trading. Later that month, Yandex shares were offered on the Nasdaq, and they closed the day at $37.75, up from their initial price of $25.
However, the market has been hit hard over the last few months. With the financial crisis in Europe continuing to worsen, and economic and political issues in the United States prompting the Standard & Poor's rating agency to downgrade the U.S. debt rating from AAA to AA+, the markets swung wildly. Such volatility, which continues to linger, is fine for savvy investors, but companies looking to go public are finding a difficult environment in which to initially offer their shares.

FSF's Star Turn in the Android FUDathon, Part 1


My first thought was that someone was engaging in click-bait journalism. Even the title of the post -- "Android GPLv2 termination worries - one more reason to upgrade to GPLv3" -- is something I would expect from anti-Android trolls, not the Free Software Foundation.

The conclusion at the bottom of the article, that companies using Android should urge Linux developers to switch to the GPLv3, is so bad it's not even wrong. It betrays a singular unawareness of the mobile market that Android serves.

Mobile phone manufacturers don't make different silicon for each market -- instead, they customize the software so that the phone can be type-approved by regulators and carriers in each country individually. Things like maximum transmitter output, radio channels, and how the device interacts with the cellular network all need to be customizable, and the device needs to be tamper-resistant.

A GPLv3 Android phone, with all the decryption keys available to any user on demand, is a non-starter. No manufacturer will make such an insecure-by-design device. No telco would put the stability of its network at such risk. No informed consumer would want one.

So what about those GPLv2 "permanent" terminations?



A New License Is Only a Download Away

"Take-it-or-leave-it" licenses like the GPL are a form of contract known variously as a "contract of adhesion," "boilerplate contract" or "standard form contract." As such, they are subject to special rules that require any ambiguities to always be resolved in favor of the recipient (contra proferentem). This "you made your bed, you sleep in it" approach is the same one we learned when we were kids -- whoever cuts the birthday cake can't complain about getting a smaller slice.

Contrary to the article's claim of "permanent termination" for violating the GPLv2 license, it's very easy to get a new license to resume distribution of a GPLv2 program. Just download or otherwise get a new copy, as per section 6 of the GPLv2, and you automatically receive a new license grant, which is valid for as long as you remain in compliance.

While this doesn't "whitewash" any problems that arose under the old license grant, it's clear that the new license cannot have additional restrictions, such as a past license termination, imposed on it.

6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License. (emphasis added)
What does contra proferentem mean for entities that had a GPLv2 license instance terminated? Among other things, if they return to compliance, they have every right to rely on the automatic license grant provisions of section 6 of the GPL when they obtain a new copy of the program.
The word "permanently" never appears in the license, and any ambiguity as to whether the termination of a previous license under section 4 prohibits them from getting a new license must be resolved in their favor. Not that there's much room for ambiguity -- "Each time ... receives a license" makes it clear that every copy comes with its own license instance.

Quick Summary for the TL;DR Set

While it is true that section 4 of the GPLv2 license terminates your right to redistribute when you fall out of compliance, section 6 is equally clear when it states that you get a valid license from the copyright-holder with each new copy you receive. Resuming distribution is simply a matter of returning to compliance and downloading a new copy.

It's true that this won't "fix" previous compliance problems; depending on their nature, they may have to be negotiated with the copyright-holders or decided by a court, but the threat of the ultimate "big stick" -- of never being able to resume distribution with the new license automatically granted under section 6 -- is an attempt to impose restrictions that neither a plain reading of the license nor the rules dealing with take-it-or-leave-it contracts allows.

Speedtest Won't Fix Your Poky Connection, but It Sure Is Nice to Know

For the most part, I barely notice the incoming speed of my Internet data connections on my iPhone 4 or iPad 2. Sure, if I want to download something large, I make sure I'm on a WiFi connection. If I'm in a car (riding as a passenger), I'll think twice about attempting to download a bunch of email out of range of an AT&T (NYSE: T) 3G tower.
But sometimes -- usually when I'm streaming a video or really need to get some work done -- it's painfully obvious that the tiny invisible blips of data are not riding the waves very fast at all, for no discernible reason. In fact, I've had poor Netflix (Nasdaq: NFLX) streaming response while using a WiFi connection only to turn off WiFi on my iPhone 4 and stream via AT&T's cellular data service instead -- with much better results.

This used to be sort of trial and error, hit and miss. But now there's an app to help you better understand what sort of Internet data movement performance you can expect: Speedtest.net Mobile Speed Test by Ookla.

This free app works much like the widely and wildly popular desktop browser-based version at Speedtest.net. You start the test, which sends some sort of meaningless download data to your desktop (or in this case, iPhone) while the app measures the speed at which you're able to gobble the data. Then it reverses and uploads a smaller bit of data.
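The mechanics behind a test like this are straightforward to sketch: time the transfer of a known number of bytes, then convert bytes per second into megabits per second. Here is a minimal illustration in Python; the `fake_fetch` function is a hypothetical stand-in for the real network download that Speedtest.net performs against its own servers.

```python
import time

def measure_mbps(payload_bytes: int, fetch) -> float:
    """Time fetch(payload_bytes) and convert the rate to megabits per second."""
    start = time.perf_counter()
    fetch(payload_bytes)  # assumed to transfer exactly payload_bytes bytes
    elapsed = time.perf_counter() - start
    return (payload_bytes * 8) / (elapsed * 1_000_000)

# Hypothetical stand-in for a network download: pretend the bytes
# took about 10 milliseconds to arrive.
def fake_fetch(n):
    time.sleep(0.01)

rate = measure_mbps(1_000_000, fake_fetch)  # roughly 800 Mbps with the fake delay
```

Real tests repeat this over several payload sizes and connections and report an aggregate, which is why the app takes a few seconds rather than one round trip.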

As with most home Internet connections, at least in the U.S., the download speeds are far faster than the upload speeds. I'm not sure where the bottleneck or tech limitations are with this; I just recognize it as a fact of the data plans, most notably seen when a regular consumer is surprised at how long it takes to upload a simple video.



Back to Speedtest.net Mobile Speed Test

The Speedtest.net Mobile Speed Test app uses Ookla's massive global infrastructure to minimize the impact of Internet congestion and latency when it tests your bandwidth. I'm not sure what this means, exactly, but I get the impression that Speedtest.net has some brains that decide which servers to connect you to in order to try to get a reasonably accurate measure of your true download/upload speeds.

For example, it wouldn't make a lot of sense to connect you to a small overloaded server in Antarctica that's trying to communicate through a tiny pipe, nor does it make sense to connect you to servers with all sorts of switches and hops in between you and the server. Technically, a blip of data ought to be moving so quickly that thousands of miles mean nothing. But really, what all this means is that you'll likely see the Speedtest.net Mobile Speed Test app connect you to a regional server for your test. The default server chosen in my tests has been from a city about 80 miles away.

In my home, I tend to get my best bandwidth during the morning hours, but as the afternoon wears on, it seems as if my bandwidth falls off a cliff. I'm guessing that every kid in my neighborhood, in the city, in the county, and in the state either gets home from school and starts playing video games on Xbox Live or starts streaming some kid flick from Netflix. Or maybe it's not the kids, but if I'm thinking about downloading a video to buy on iTunes ... let's just say that I don't usually bother attempting it from 4 p.m. to 8 p.m.

In fact, I've had a roomful of family over during the holidays, and when we all finally agreed on which HD movie to rent on my Apple (Nasdaq: AAPL) TV, we realized that, oops, this puppy will be ready to watch in two hours.

Chronicles of Desktop Deaths Foretold


Now that September has arrived at last, life has taken on a different tone here in the Northern reaches of the Linux blogosphere.
After all, just around the corner now are crisp and cool days, Halloween, and the crunch of fallen leaves underfoot as nature prepares for its long winter sleep.

It's perhaps no great surprise, then, that many thoughts seem to have turned to death and dying in this season of decay. No longer confined to a few heavily air-conditioned bars and saloons, bloggers have begun to lift their heads and ponder the end of things -- not just in the natural world but in technology as well.

Death All Around

"The end of the OS is nigh," read one headline not long ago, for example.

"Desktop computers changing, not dying" insisted another.

And again: "Desktop: 'The report of my death was an exaggeration,'" read yet another.

There's been a distinctly morbid focus in the Linux blogs lately, in other words, and Linux Girl wanted to learn more.



'The Desktop Is Here to Stay'

"What is 'death' here?" mused Chris Travers, a Slashdot blogger who works on the LedgerSMB project. "It seems to me that what people are saying is not that we won't use these things, but that they won't occupy the central role in our lives that they have in the past. In all cases, we are talking about trends that are exaggerated."

Desktops, for example, "will always be extremely handy forms of computers," Travers told Linux Girl. "Nobody is going to stop using a desktop just because they now have a series of mobile devices. Desktops are too useful in business and at home for that to stop, and they are far less expensive than even laptops of comparable power and reliability."

In other words, "the desktop is here to stay," he asserted.

'The Browser Is More Important'

Same with the OS, Travers added. "While a lot more may run in the browser, that hardly makes the OS less relevant. Something has to provide the base services to the browser."

What's actually happening, then, is that people are simply less attached to the OS, he suggested.

"What the author is actually saying is, 'the browser is more important, so I don't care about the OS anymore,'" Travers concluded.

'My Desktop Has More Power'

"The desktop just isn't trendy anymore," consultant and Slashdot blogger Gerhard Mack agreed. "First the laptop was supposed to kill it and now it's the cell phone and tablet? Bad idea."

Mack himself has a desktop, a laptop, an HTC Desire Z cellphone and a work-provided Galaxy Tab, he told Linux Girl.

"Care to guess which one I use the most?" he asked. "It's the desktop. My desktop has more power than the rest of my devices put together, the keyboard is at the proper typing height, and the monitors are on an ergonomic stand to keep my neck from being strained."

Desktops dominate Mack's workplace as well, he said.

"We would be totally screwed if the desktop went away," Mack concluded, "and I doubt many other offices are different from ours."

'Dying, Not Dead'

Hyperlogos blogger Martin Espinoza didn't dispute the desktop's ultimate demise -- just how soon it would happen.

"The desktop is dying, it's not dead," Espinoza told Linux Girl. "These things don't happen overnight."

People are finally "getting their hands on quad-core phones with HDMI output, so now they have a feasible desktop replacement for the majority of purposes that they carry around in their pocket," he explained. "Since these devices are now in the hands of the public, they may begin to meaningfully supplant the desktop as the primary computer for getting things done."

'Overpriced Toys for Boys'

It's actually laptops that continue to dominate the market, according to Barbara Hudson, a blogger on Slashdot who goes by "Tom" on the site.

"How can they not when the local big-box is selling name-brand 15.6-inch quad-core laptops with 6 gigs of RAM and a 750-gig hard drive for (US)$400?" she pointed out. "With this sort of value proposition, laptops are killing desktops and netbooks, as well as giving tablet manufacturers headaches by making tablets look like overpriced toys for boys in comparison."

As for the operating system becoming a commodity, that's just what it's supposed to be, Hudson asserted. "The real question is how long before all applications integrate with the Net seamlessly? Games have been doing it for decades."

Scotland Yard Tightens the Pincers on Anonymous

It's been another wild and crazy week for the security community.

Scotland Yard arrested two suspected members of Anonymous and LulzSec Thursday.

Meanwhile, the major players in the browser market -- Google (Nasdaq: GOOG), Microsoft (Nasdaq: MSFT) and the Mozilla Foundation -- have chopped Dutch certificate authority DigiNotar off at the knees, apparently because it was slow to warn that hackers had broken into its network and issued rogue SSL security certificates.

Further, a security researcher released information that hackers could use to leverage Google's massive bandwidth and launch large-scale distributed denial of service (DDoS) and SQL injection attacks.

The Star Wars Galaxies gaming site was also hacked this past week, and the hacker posted the user IDs and passwords of 23,000 of the site's members on the Web.

Finally, a survey by security vendor Veriphyr has found that healthcare organizations are suffering data breaches hand over fist.


Ho, Hackers! The Game's Afoot!

Scotland Yard arrested two suspects in separate counties Thursday, reportedly under suspicion of conducting online attacks under the handle "Kayla."

"Kayla" was allegedly among those behind the February Anonymous intrusions perpetrated on HBGary Federal, a company claiming to provide security to the United States federal government.

The attackers defaced HBGary's website, stole and published 71,000 internal emails from the company, and posted a message denouncing it.

Lack of Speed Kills

On Monday, Google learned that some users of its encrypted services in Iran suffered attempts at man-in-the-middle attacks, where someone tries to intercept communications between two parties.

The attacker used a fake SSL certificate issued by Dutch root certificate authority DigiNotar.

It seems an intruder had broken into DigiNotar's systems back in July and stolen up to 200 rogue, or fraudulent, SSL certificates, some for major domains.
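One defence against rogue certificates of this kind is pinning: instead of trusting any certificate signed by any of the browser's hundreds of root authorities, a client compares the certificate it receives against a known-good SHA-256 fingerprint obtained out of band. The sketch below illustrates the idea only; the byte strings are made-up stand-ins for real DER-encoded certificates.

```python
import hashlib

def matches_pin(cert_der: bytes, pinned_sha256_hex: str) -> bool:
    """True if the certificate's SHA-256 fingerprint equals the pinned value."""
    return hashlib.sha256(cert_der).hexdigest() == pinned_sha256_hex.lower()

# Hypothetical certificates; a real client would pin the fingerprint of the
# server's certificate (or its public key) distributed with the application.
genuine_cert = b"genuine-certificate-der-bytes"
pinned = hashlib.sha256(genuine_cert).hexdigest()

print(matches_pin(genuine_cert, pinned))         # True: fingerprint matches
print(matches_pin(b"rogue-cert-bytes", pinned))  # False: a substituted certificate fails
```

A pinning check like this reportedly played a role in surfacing the rogue *.google.com certificate: Chrome ships with pins for Google's own domains, so the fraudulent certificate failed validation even though it chained to a trusted root.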

DigiNotar had known about the breach since July 19 but apparently had not disclosed the information.

In response, Google, Mozilla and Microsoft all revoked trust in the DigiNotar root certificate in their browsers.

"These certificates could be used as part of attacks designed to harvest user Gmail credentials and gain access to sensitive data," Norman Sadeh, cofounder of Wombat Security Technologies, told TechNewsWorld.

Disabling DigiNotar's root certificate authority was justified because "security across the Internet is a shared responsibility and our root certificate authorities must be held to the highest standard," Don DeBolt, director of threat research at Total Defense, told TechNewsWorld.

Google spokesperson Chris Gaither declined comment.

Leveraging Google's Bandwidth for Hacks

A security researcher has disclosed on the IHTeam blog how attackers can use Google's servers to launch a DDoS attack.

Hackers can also use the technique to launch SQL injection attacks, one of the top 10 vectors of attack, according to the tester, who goes by the handle "r00t.ati."

The tester posted the information Monday after Google's security center had failed to respond to a notification of the threat sent Aug. 10.

Google posted a message on the IHTeam blog Friday apologizing and stating it has tweaked its security.

"This is a serious issue, and even if Google fixes these two vulnerable pages, bad actors will likely comb Google's pages from now on looking for a similar vulnerability," Total Defense's DeBolt remarked.

"My understanding is, this is not a software vulnerability, but rather a description of service misuse that we have not seen in practice," Google spokesperson Jay Nancarrow told TechNewsWorld.

Multiple social networking and online translator sites could also be used by hackers to launch attacks in the same way, Nancarrow pointed out.

The Force Isn't Strong With This One

This past week, a hacker broke into the Star Wars Galaxies gaming site, stole the user IDs and passwords of 23,000 members, and posted them on the Internet.

All the passwords are in plain text, the hacker said.
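Plain-text storage is exactly what salted password hashing is meant to prevent: a site stores only a random salt and a slow one-way digest, so a leaked database does not directly expose anyone's password. A minimal sketch using Python's standard-library PBKDF2 follows; the iteration count is illustrative, not a recommendation for production.

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; higher counts make brute force slower

def hash_password(password: str, salt: bytes = None):
    """Return (salt, digest); store these instead of the password itself."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))      # True
print(verify_password("wrong guess", salt, digest))  # False
```

With a scheme like this, the Star Wars Galaxies attacker would have obtained salts and digests rather than 23,000 usable credentials.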

SWGalaxies isn't the only gaming site to have been victimized in recent months. Earlier this year, the Sega website and the Sony (NYSE: SNE) PlayStation Network were hacked, with data on more than 100 million users stolen from Sony's networks alone.

Are game sites more vulnerable than others? Not necessarily, but they often aren't as heavily fortified as, say, banking sites. That needs to change, Todd Feinman, CEO of Identity Finder, told TechNewsWorld.

"Any institution that stores personal information, including a password, should be held to a higher standard and be accountable for loss of sensitive data," Feinman stated.

Healthcare and Privacy

More than 70 percent of respondents to an online survey on privacy breaches concerning protected health information have suffered at least one breach in the past 12 months, according to a study conducted by security vendor Veriphyr.

Hospitals and health systems constituted 52 percent of the 90 respondents, Veriphyr CEO Alan Norquist told TechNewsWorld. Half the responding organizations had more than 1,000 employees.

Big Mango Falls From HTC's Tree

HTC has shown off its first two smartphones running Mango, Microsoft's (Nasdaq: MSFT) upcoming update for its Windows Phone mobile operating system.

The new devices, the Titan and the Radar, are being shown to some consumers in London, Paris, Madrid and Berlin.

The Titan has a 4.7-inch display -- much larger than a typical smartphone's display and more than 30 percent larger than the iPhone 4's 3.5-inch screen. The Radar's screen is smaller at 3.8 inches.


At a time when so many smartphones look so much alike, a particularly large screen could make the Titan conspicuous to consumers, if nothing else.

"Every form factor has been tried, and now it looks like manufacturers are going to larger and larger smartphones," Allen Nogee, a research director at In-Stat, told TechNewsWorld. "When almost all smartphones look exactly the same and run the same applications, how do you set your product apart from the crowd?"

HTC did not respond to requests for comment by press time.

Both the Titan and the Radar will be broadly available worldwide from October, starting in Europe and Asia.



Tech Specs for the Mango Devices

The Titan and the Radar both have the standard front and rear cameras and can shoot 720p HD videos. Both have a dedicated hardware camera button that lets users take photographs without having to unlock the phones.

They both come with the HTC Watch video service, which was introduced in April on the then-newly launched HTC Sensation 4G.

Watch is an application and service that provides access to the latest premium movies and TV shows.

Both the Titan and the Radar offer access to Microsoft's Zune music service and have Virtual 5.1 surround sound.

They both have HTML5 support and let owners access Microsoft Xbox Live. Both also provide the usual access to social networking services.

The HTC Titan has a 4.7-inch LCD screen, an ultra-slim 9.9mm curved brushed aluminum shell, and built-in Microsoft Office Mobile. It has a Qualcomm (Nasdaq: QCOM) Snapdragon 1.5GHz processor and is a 3G device.

The HTC Radar is also a 3G smartphone. It has a Qualcomm Snapdragon 1GHz processor.

The Sense-less HTC Smartphones

Notably, neither the Titan nor the Radar has the HTC Sense graphical user interface, which HTC developed for mobile devices running Windows Mobile, Android and Brew.


That could be an attempt by Microsoft to exert some control over its new mobile phone system.

One of the issues that plagued Windows Mobile, the predecessor of WinPho7, is that each smartphone manufacturer put its own UI on top of the operating system, resulting in a fragmentation of the market and a lack of interoperability among Windows mobile phones.

"I think Microsoft is clamping down on fragmentation," IDC's Stofega opined. "They learned with Windows Mobile 5 and 6 that you need some sort of control lever to make sure people don't do things that are ultimately not good for the operating system."

The Wedding Crashers


Nobody expected AT&T (NYSE: T) to have an especially easy time convincing regulators to allow it to buy up rival wireless carrier T-Mobile. AT&T announced its intentions last spring to purchase the fourth-largest U.S. carrier from parent company Deutsche Telekom (NYSE: DT) for US$39 billion, and critics from all corners wasted no time expressing why they thought that would be a very bad idea.

But that's not to say everyone thought it would be impossible. If the prevailing winds of antitrust regulation weren't strong enough to knock Comcast's (Nasdaq: CMCSK) bid for Universal off course, then who's to say AT&T's deal wouldn't eventually fly too?

Now, though, it looks like the proposal has encountered its biggest blow yet, and it may end up crushing the merger completely. The U.S. Department of Justice has filed a civil antitrust suit to block the buyout, claiming such a deal would significantly hurt competition in the U.S. wireless market. If allowed to go through, the purchase would end up hurting consumers through higher prices, diminished service quality, fewer choices and slowed innovation, according to the DoJ.

Just as the suit was announced, the U.S. Federal Communications Commission chimed in with a message of support for the Justice Department's action.

Over the last few months, AT&T has taken every opportunity it could get to convince regulators, watchdogs, consumers, you, me and every other living thing on the planet that the merger was a great idea. Just as the suit was announced, AT&T was busy publicizing a new reason everyone should get behind the deal: jobs. Letting the company buy up T-Mobile would enable it to bring 5,000 outsourced jobs back to the U.S., the company claimed. It's still not clear how many existing U.S. jobs the merger would have eliminated, though.

VirtualBox: A Clean Sandbox for Your Linux Desktop

Constantly testing software and tinkering with a variety of Linux operating systems puts my multiple test-bench computers to constant use. Granted, Linux comes with a lot fewer security risks. But unknown factors and beta glitches can be time-consuming to correct when they take down an entire box.

A much safer and quicker way to deal with such potential harm is to spare the physical machines and run the new stuff in a virtual machine instead. Oracle's (Nasdaq: ORCL) VM VirtualBox 4.0 is a handy app for doing just that. It runs nicely in a variety of Linux distros.

Oracle's VM software is not the only choice for running other OSes inside a particular Linux distro, but it is one of the easiest solutions. Alternatives include Parallels, Qemu, KVM (Kernel-based Virtual Machine) and VMware.

These options are not an equal fit. Each one has its own strengths and weaknesses. But if your goal is to run something else without having to shut down your currently running OS and dual boot or turn on a second computer, VirtualBox is a very solid option to choose.


Virtual Info

VirtualBox runs on both x86 and AMD64/Intel64 systems and is suitable for enterprise or home use. Enterprise users will appreciate its feature-rich, high-quality performance. Home users will appreciate that it is simple enough to use without poring over documentation.

Besides being Linux-friendly, Oracle's VirtualBox runs on Windows, Macintosh and Solaris hosts and supports a large number of guest operating systems including, but not limited to, Windows (NT 4.0, 2000, XP, Server 2003, Vista, Windows 7), DOS/Windows 3.x, Linux (2.4 and 2.6), Solaris and OpenSolaris, OS/2, and OpenBSD.

It is a general-purpose full virtualizer that targets server, desktop and embedded use. VirtualBox 4.0.8 was released on May 16 as a maintenance upgrade. Version 4.0, released last December, was a big step forward in adding features to VirtualBox.

The current 4.0 line introduced a two-fold product distribution to better address user needs. The base package is a fully functional release. Extension Packs add further feature refinements that provide more specialized solutions.

Download and Get Going

I have used earlier versions of VirtualBox on other versions of the Ubuntu Linux OS. I put this latest version through its paces on Ubuntu 11.04 running the classic rather than Unity desktop option.

Normally, I install software packages through the Ubuntu Software Center when it is available. VM VirtualBox is available through Ubuntu's repository.

But I wanted to ensure that I had the latest version, so I downloaded the .deb package from the virtualbox.org wiki site. Not having to unarchive the package or compile it was much appreciated. It installed through the package manager with no issues.

The Setup Scenario

Much like word processors or spreadsheets, virtual machine apps are very similar to one another in look and feel. You follow the same basic process to run a VM app within a host OS.

For instance, you first have to create the virtual environment and set certain parameters. The virtual machine must share processor and memory resources with the host.

If you over-tax the VM app, the host's performance suffers. The goal is to be able to run programs or processes within the isolated VM environment while still being able to run the host's applications. So it becomes a balancing act.

After setting up the virtual machine to run inside the physical computer, you then install one or more OSes to run inside the VM. VirtualBox does this very handily by using two separate wizards.

Get Started

The setup process is very straightforward with VirtualBox. Once you have the first run process completed, all you have to do is run the app and click on your installed options. Here is how it works.

First, press the New button in the main tool bar at the upper left of the VirtualBox window. This loads the New Virtual Machine Wizard. Then press the Next button at the bottom of the window.

Now enter a name for the VM session and select the OS Type from the drop-down menu. The choices are MS Windows, Linux, Solaris, BSD, IBM (NYSE: IBM) OS/2, Mac OS X or other.

Next, select the amount of base memory (RAM) in megabytes you want to allocate to the virtual machine. The recommended amount is 256MB, which is the default setting on the slide bar. Then click Next.

Finally, select the type of Virtual Hard Disk the virtual machine will use as the boot hard disk. You have two choices: create a new one or use an existing one. The default is to create a new one. Then click Next again.
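The same first-wizard steps can also be scripted with VirtualBox's VBoxManage command-line tool. The sketch below is a dry run by default (it only echoes the commands); the VM name, OS type and memory size are illustrative placeholders, not values from the article.

```shell
#!/bin/sh
# Dry-run sketch of the New Virtual Machine wizard via VBoxManage.
# Set RUN="VBoxManage" to execute the commands for real.
RUN="${RUN:-echo}"

VM_NAME="demo-vm"   # name entered on the wizard's first page (placeholder)
OS_TYPE="Ubuntu"    # equivalent of the OS Type drop-down (placeholder)
RAM_MB=256          # base memory; 256MB is the wizard's default

$RUN createvm --name "$VM_NAME" --ostype "$OS_TYPE" --register
$RUN modifyvm "$VM_NAME" --memory "$RAM_MB"
```

Running it with RUN left at its default simply prints the two commands, which makes it easy to review before committing.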

Moving On

The second part of the first use process is creating the virtual disk on your physical system. The VirtualBox window now displays the Create New Virtual Disk Wizard window. Start it by clicking the Next button.

Now you need to decide on the hard disk storage type. Again, you have two choices.

The default is dynamically expanding storage. This lets the virtual machine start out with a small storage area on the physical hard drive and expand it as needed.

The other choice is fixed-size storage. This allocates a set amount of space up front, consuming roughly as much room on the physical disk as the full size of the virtual hard disk.

Be careful here. Creating fixed-size storage can take a long time, depending on the storage size and the write performance of your hard disk. Click the Next button to continue, or the Back button to change your choice.

Almost There

Next comes setting the Virtual Disk location and size. The Location window shows the name you already entered for the current VM session. You can change it or click the file manager-like button to select the desired location.

Next you must use the sliding scale to set the virtual hard disk size in megabytes. This size will be reported to the Guest OS as the maximum size of this hard disk. When ready, click the Back or Next buttons.

Carefully view the summary window. If you want to make changes, click the Back button. Otherwise, click the Finish button. A second summary window appears. Almost instantly, the VirtualBox Manager window replaces the previous summary window.
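The Create New Virtual Disk wizard has a command-line counterpart as well. Another hedged dry-run sketch follows; the file name and size are placeholders, and in VirtualBox 4.0's createhd command "--variant Standard" corresponds to dynamically expanding storage while "--variant Fixed" corresponds to fixed-size storage.

```shell
#!/bin/sh
# Dry-run sketch of the Create New Virtual Disk wizard via VBoxManage.
# Set RUN="VBoxManage" to execute for real.
RUN="${RUN:-echo}"

DISK="demo-vm.vdi"  # virtual disk file (placeholder name/location)
SIZE_MB=8192        # maximum size reported to the guest OS, in MB

# Dynamically expanding storage (the wizard's default): starts small
# on the physical drive and grows as needed.
$RUN createhd --filename "$DISK" --size "$SIZE_MB" --variant Standard

# Fixed-size storage: allocates the full amount up front, which can
# take a long time for large disks.
# $RUN createhd --filename "$DISK" --size "$SIZE_MB" --variant Fixed
```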

How to Use It

The first use setup is now finished. All that remains to be done is putting the guest OS in the virtual machine.

To start this installation, put the installation medium in the drive or find out its storage location on the physical hard drive. Then click the New button in the VM VirtualBox Manager window.

Follow the prompts. The process repeats the steps of the second wizard above.

To run the VM installation, click on the Guest OS label in the left panel of the VirtualBox Manager. The guest OS will run in its own self-contained sandbox window on the desktop. When you are finished, you can suspend the VM to resume it in a later session, or quit the session entirely.
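Attaching the disk and starting or suspending the guest can likewise be driven from the command line. A dry-run sketch under the same assumptions as before; the VM, controller and medium names are placeholders.

```shell
#!/bin/sh
# Dry-run sketch: attach the virtual disk, start the guest, suspend it.
# Set RUN="VBoxManage" to execute for real.
RUN="${RUN:-echo}"
VM_NAME="demo-vm"   # placeholder VM name

$RUN storagectl "$VM_NAME" --name "SATA" --add sata
$RUN storageattach "$VM_NAME" --storagectl "SATA" \
    --port 0 --device 0 --type hdd --medium demo-vm.vdi
$RUN startvm "$VM_NAME"              # guest runs in its own window
$RUN controlvm "$VM_NAME" savestate  # suspend, to resume next session
```

The savestate subcommand is the CLI equivalent of suspending the VM from its window, freeing the host's resources until the next session.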

Meanwhile, the host environment returns to full resource capacity once you exit the VirtualBox application. This sure beats interrupting what you are doing in the host Linux desktop so you can reboot and dual-boot the system two or three times.

iCloud's Shadow on Security

This story was originally published on July 8, 2011, and is brought to you today as part of our Best of ECT News series.


Apple's (Nasdaq: AAPL) announcement of its upcoming iCloud service has sparked a flurry of excitement in the industry.

Some analysts expect the iCloud will help Apple keep customers closer to its bosom -- make them "stickier," in analystspeak. Others think the iCloud will give a boost to cloud computing.

The iCloud will automate the backup and storage of data -- music, photos and what-have-you -- and make it easy to set up new iDevices because everything on a user's old iDevice will be in the iCloud, and setting up the new one will simply be a matter of downloading that information.

iCloud will also automatically update all a user's content and information, including documents created, and sync them across the user's devices.

All of this sounds great for users, but its security implications are still uncertain.

With the increasing consumerization of the enterprise, iCloud has also raised questions regarding corporate security. As executives and workers use their iPads and iPhones in their daily work, is there a risk of corporate documents being accidentally exposed to outside eyes when devices are automatically synced on the iCloud?

Apple did not respond to requests for comment by press time.



Insecurity and the Cloud

"The advent of the cloud changes the focus of security onto the atomic components of individual machines and the data itself," Al Maslowski-Yerges, director of consulting services at En Pointe Technologies, told MacNewsWorld.

"Depending on the trust model for applications that use iCloud, it has the potential to raise the risk level significantly," Maslowski-Yerges said. "Without ways of securing data and the end points, it will be very hard to secure iCloud or any other similar service."

Apple's iCloud will accelerate the process of cloud services pouring into the corporate environment, Geoff Webb, product marketing director at Credant Technologies, told MacNewsWorld.

Hackers can launch attacks through any method of placing content on an end-user device, such as email, browsers, USB drives and removable media, Maslowski-Yerges remarked.

The iCloud's automated sync and backup features will "probably be very attractive to attackers," Maslowski-Yerges concluded.

However, Gunter Ollmann, Damballa's vice president of research, contends that iCloud is currently unattractive to attackers because it's "essentially an online file storage system and access authorization service" that doesn't appear to allow anonymous access.

It's more likely that attackers will attempt to use iCloud to propagate malware among multiple devices belonging to a victim and "as a technique of remaining a persistent threat," Ollmann said.

"I'm more worried about two other areas," Credant's Webb said. "One is the increasing risk of sensitive information leaking out of the business via iCloud and the second is the possibility of iCloud itself becoming a target."

The Danger of Devices

Apple's iTunes store was reportedly hacked recently, and victims' accounts were bilked for payments for Sega's "Kingdom Conquest" game. Sega denied responsibility, stating the game would only be charged to a victim's iTunes account if someone installed the app, logged into the victim's account with valid credentials, then made a purchase.

Perhaps the crooks simply guessed the passwords on victims' iDevices. iOS app developer Daniel Amitay found that 15 percent of iPhone owners used one of just 10 four-digit passcodes. These included "1234" (a sequence often preprogrammed into devices at the factory), "0000," "1111" and "2222."

Two other favorites were "0852" and "2580," both of which consist of selecting keys in one column of the iPhone's keypad.

A Transformational Model

By its very nature, the iCloud could constitute another threat to enterprise security.

The iCloud changes the threat model from a distributed one, in which hackers have to target individual users or companies, to a centralized model, Andrew Storms, director of security operations for nCircle, told MacNewsWorld.

In a centralized model, all hackers need to do is focus their efforts on breaking into a hub -- in this case, iCloud -- in the hope of gaining access to data on millions of customers, Storms explained.

The chances of such an attack succeeding are high because iCloud is a free service, Storms contended. As such, it's reasonable to assume that almost every Apple user will take advantage of the iCloud, he said.

However, the real issue is Apple's lack of transparency around its security methodologies, Storms remarked.

"That means every iCloud and potential iCloud user is left guessing about how safe their data will be," Storms explained.

"Enterprise security teams, in particular, do not want to guess," Storms said. "They want solid information so they can build it into the risk models they use to protect the business."

Content Consumption Can Hurt

The management and control of confidential data and intellectual property are among the most difficult aspects of enterprise IT security, Storms said.

The iCloud's autosyncing feature may possibly expose sensitive corporate information to unauthorized people.

"With iCloud being free and very easy to use, every iPhone and iPad user in your company will very probably be syncing their devices to the iCloud, and Apple hasn't provided any tools the enterprise can use to control what kind of data can be stored in the iCloud," Storms pointed out.

"The biggest worry has got to be around the document sync feature," Credant's Webb remarked.

The School of Gaming

Narratology and ludology are the two generally accepted approaches to thinking about games. Though not incompatible, these two branches of knowledge nonetheless contend for pre-eminence among video game designers' priorities.

The first emphasizes story, the second play. In literary theory, narratology is the study of narrative structure -- a story looks to amuse, instruct or entertain, and so is designed for us to take in. Ludology, rooted in the Latin word "ludus," meaning "game," is the academic study of games, particularly video games; it is about participation, and it is a huge field.

Story you watch. Play you do.

Game designers must understand and resolve the tension between these contending forces if they're to create successful games. The choices and preferences they manifest in their creations affect gameplay for millions of people the world over.

More than 100 colleges and universities in North America -- up from fewer than a dozen five years ago -- now offer some form of "video game studies," ranging from hard-core computer science to prepare students for game-making careers to critiques of games as cultural artifacts. Game studies is largely a multi- and inter-disciplinary field with researchers and academics from a multitude of other areas such as computer science, psychology, sociology, anthropology, arts and literature, media studies, communication and more. Topics range from game philology to the study of virtual economies in "EverQuest."



The Epicenter of Game Design Studies

If California is Video Game Central in the U.S., the University of Southern California (USC) is Ground Zero for incubating game designer talent.

In January, the Trojans won -- for the second year in a row -- the top two prizes from GamePro Media and The Princeton Review in the "Top Schools for Video Game Design Study for 2011." Some of the schools and departments behind this accomplishment include:

USC Interactive Media Division & Computer Science (Game Development)
USC School of Cinematic Arts
USC GamePipe Laboratory
USC Andrew and Erna Viterbi School of Engineering (computer science major with a concentration in game development)
Michael Zyda is the director of the USC GamePipe Laboratory and a professor of engineering practice in the USC Department of Computer Science. At USC, he created the BS in Computer Science (Games) and the MS in Computer Science (Game Development) cross-disciplinary degree programs.

"Designers who understand the history of play and the structure and application of play are worth their weight in gold," he told TechNewsWorld. "Story is also nice, but play is key ... that and understanding what people do that makes them play and keep playing."

Tracy Fullerton is director of the USC Electronic Arts (Nasdaq: ERTS) Game Innovation Lab and an associate professor.

"There are a number of educational challenges in preparing students to enter the game industry," she told TechNewsWorld. "These include teaching core skills in procedural literacy and creative expression, but also acknowledging that the industry is now in a time of shifting business models and technology. This can be confusing, but it also provides a lot of opportunity for new talent just entering the workforce."

As for play vs. story: "Our students practice creating compelling gameplay and engaging stories," Fullerton said. "They learn to balance story and game, to integrate worlds, characters and play in ways that allow for rich, transmedia experiences."

Yannis Yortsos is dean of USC's Viterbi School, and he sees the educational issues as key.

"Teaching collaboration between programmers, designers and artists is an important challenge," he told TechNewsWorld. "People with these three gifts each have different ways of operating. Getting them all to operate together so they can become good at this is actually the biggest success we have as a program."

USC alumnus Artem Kovalovs is gameplay programmer at Visceral Games, part of Electronic Arts (EA). Kovalovs is building a 3D video game engine from scratch in C++. He began building the PRIME Engine 3D game framework for PC and XBox 360 with help from other students while at USC. The world's first 3D browser-based MMO (massively multiplayer online) engine, PRIME provides a complete suite of tools and technology that allows developers to create diverse 3D worlds and have them populated by thousands of players interacting with each other in real time.

The biggest programming challenge facing the industry today, according to Kovalovs, is keeping designers from "getting lost" in the huge code base of any game they're designing.

"We're talking about thousands of files, which can become overwhelming," he told TechNewsWorld. "I tell young designers to get started fast and keep going. Make the work modular so that you're doing smaller tasks, step by step. That way you don't get discouraged and think about stopping -- always keep going forward."

The Game Lab

Fifty miles to the south sits another booming schoolhouse for video game design studies -- the University of California at Irvine (UCI).

Dan Frost is a lecturer in the informatics department of the Donald Bren School of Information and Computer Sciences. Young designers, according to Frost, find the ludology/narratology framework to be "tired, unnecessarily binary and of little use when making design decisions."

Explained Frost: "I see them taking a more visceral approach. Is it fun? Is it a new twist on an old mechanic? Will it draw me in and keep me playing?"

One of the biggest challenges game designers face is "the tension between depth and breadth," he said.

"Companies often want to hire people with expertise in a narrow area, and students sometimes view their education through the prism of finding a first job," Frost told TechNewsWorld. "But universities have an obligation to educate students broadly, both to help them discover where they'll want to specialize in the future, and to prepare them for the jobs they will have in five, 10 or 15 years."

Multiplayer gaming, virtual worlds, alternative game genres, and games and gender are all specialties of Celia Pearce, a game designer, author, researcher, teacher, curator and artist. She currently is assistant professor of digital media in the School of Literature, Communication and Culture at Georgia Tech, where she also directs the Experimental Game Lab and the Emergent Game Group. She is the author or co-author of numerous papers and book chapters, as well as The Interactive Book and Communities of Play: Emergent Cultures in Multiplayer Games and Virtual Worlds.

The Insane Month of August: So Long, Farewell, Auf Wiedersehen, Goodbye!




We have crazy months from time to time, but August will likely go down in history as one of the biggest tech news months of any year. From the torpedoing of Android by Google (Nasdaq: GOOG), to the off-again, on-again TouchPad sales, to the departure of Steve Jobs, to the slashing of Oracle's (Nasdaq: ORCL) US$1.3 billion judgment, to the...
Well, I'll get to all of this in a moment, and I'm sure we are all glad to be looking back at the insane month of August.

I'll end with my product of the week: the first five-door hatchback that I might actually be tempted to buy, the Audi A7.



HP's Do-Over Month

August was the month that I think HP (NYSE: HPQ) would like to do over. Pretty much every time I turned around, I was trying to explain something it did. It may have had good reasons, but with its valuation down sharply, its execution left a bit to be desired.

The biggest was its getting out of, er selling, er spinning out, the PC business. Now to be clear, nothing will be changing at all in the next 12 to 18 months, and then you'll only know the plan -- which will likely take three to 24 months to execute (depending on the details). So there should have been no news here, certainly nothing actionable.

Yet by announcing it was thinking of doing something big, it scared the crap out of buyers (both consumers and businesses) and investors, and put a smile on the face of Michael Dell (Nasdaq: DELL) and his peers at the other PC firms. The reason it did this was that the plan to develop a plan had leaked out, and given this would be clearly material, it needed to avoid an SEC event. Boy, if there was ever a firm that should borrow Apple's (Nasdaq: AAPL) "loose lips sink ships" posters, it is HP.

That alone would have been enough to cause HP to stand out, but it also discontinued its TouchPad and then un-discontinued it so it could discontinue it again.

Now HP made a ton of mistakes bringing the TouchPad to market, from the name it chose to the method used for the merger with Palm (more than 80 percent of mergers like this are unsuccessful).

Then Best Buy (NYSE: BBY) threatened to send a ton of unsold TouchPads back, and HP pulled the plug -- this despite the fact that most reviewers I know placed this crippled product right behind the iPad, suggesting the second generation (which now won't exist) could have kicked some Apple butt.

Then it un-pulled the plug because pulling the plug created too many short-term problems. The whole thing had some folks even questioning whether HP could run a data center.

Who would have thought that selling more tablets than Apple did iPads in a given period would be a bad thing?

Oracle Out of Luck

HP's mortal enemy Oracle had a very strange month as well. There was the filing it made in the Oracle/HP litigation that seemed designed mostly to make it look like Oracle's attorneys were doing something to justify their hourly charges.

They maintained that Oracle wouldn't have settled with HP on Hurd's hiring if Oracle had known HP was going to hire an old (like more than a decade old) Oracle CEO as chairman and an ex-SAP (Oracle competitor) CEO as the new HP CEO. These things were so unrelated, it gave new meaning to the term "throwing sh*t against the wall." Seriously Oracle, no one was planning on pulling the plug on Itanium; this won't fix that.

Adding to this impression is the latest news that Oracle had its $1.3B judgment against SAP (NYSE: SAP) overturned. That's not chump change, and the attached pleadings would indicate that Oracle didn't meet its burden. It is really tough to lose on appeal like this. It is great to see a legal team really step up.

OK, I shouldn't kid about this because Larry is likely running around shooting attorneys this week. Wonder if he wants any help?

Speaking of help, Oracle fell under bribery investigation. Now I used to be an internal auditor, and bribery charges like this are a bit of an inside joke. You see, to do business in some countries, you have to use bribery. The only other choice is to exit the country, and no one -- including the U.S. government -- wants you to do that.

On top of that, the cause is that the politicians and bureaucrats who require the bribes are too powerful to touch. So this is kind of a Russian roulette tax; virtually no one goes to jail -- instead you pay a hefty fine and then are allowed to go on your way. You do this hoping the next company caught won't be you (the Russian roulette part). In effect, it just became Oracle's turn to be punished for doing something pretty much everyone agrees they have to do.

Google Buys Motorola

Boy, after Microsoft's (Nasdaq: MSFT) experience with the Zune, Apple's with licensing MacOS, and IBM's (NYSE: IBM) with OS/2, if there's one hard rule in tech, it's that doing hardware while trying to license software is a losing battle. No, that isn't right -- it is typically a multibillion-dollar disaster. So what does Google -- the company that hasn't met a Microsoft mistake it doesn't repeat -- do? It buys Motorola and announces "Android Will Stay Open." Which in tech speak means it probably won't be open much longer.

Suddenly Samsung -- the company that hasn't met an OS it doesn't like -- is looking to buy WebOS from HP. Boy, if there was ever a platform that didn't need any more drama, it's Android.

August also saw the Samsung Galaxy Tab get blocked and unblocked in a running battle across Europe, which also likely contributed to Samsung's interest in a less litigation-prone platform. One accountant argued Google did it for the tax breaks. Sure it did.

Steve Jobs Leaves Apple

I've said plenty about this before (The Day the Magic Died), but Jobs is not only the glue that holds Apple together; it is largely his design influence that is keeping us from going back to cheap white-box computers and largely uninteresting smartphones.

His exit as CEO did have a bunch of us looking back and talking about different aspects of the Jobs' experience. One of the best was this piece on why Apple thought tablets were stupid. I did point out that one of our big problems as an industry is that under current hiring practices, Google, Microsoft and even Apple wouldn't ever hire a young Steve Jobs.

And that is a sad note to end on. But I think HP, Oracle, Google and Apple folks can all smile about one thing: August is over.

Researchers Rev Up Electric Nano-Motors


Researchers at Tufts University announced Sunday they've created an electrical motor many thousands of times smaller than the width of a single human hair, a breakthrough they claim could eventually lead to innovations in healthcare and technology.

The microscopic motor is the size of a single molecule and is electrically powered, an innovative feat since previous single-molecule-sized motors were powered by chemicals or light.

The distinction is important. With a light- or chemically powered motor, scientists struggle for precision when adding chemicals to a clump of trillions of molecules. In that scenario, the practicality of driving the motors decreases.

With the Tufts research, though, a team led by Associate Professor in Chemistry Charles Sykes found that electricity could be used with precision and accuracy to control a single molecule, even when additional molecules sat just a nanometer away.

The team says the ability to control a single-molecule motor could lead to innovations in more precise medical and engineering technology and could lead to tinier, advanced digital devices.

"An interesting thing would be to get this into optoelectronics, where you are interfacing light with electronics. With the tiny charges on the motors, you have a tiny rotating motor and you could create something that would give off light, or a tiny microwave generator or tiny antenna," Sykes told TechNewsWorld.



Big Technological Steps

The research, published in a recent edition of Nature Nanotechnology, was made possible thanks to huge advances in research technology such as the scanning tunneling microscope, which uses electrons instead of light.

The scientists sent an electrical current through a butyl methyl sulfide molecule using the tip of the microscope as the molecule rested on a copper surface. In that position, a single atom worked as a pivot.

From there, the scientists could measure the molecule as it spun to prove the movements were directed by the electrical charge.

Long Way to Go

While the implications could be far-reaching in the technological field, Sykes and other chemistry experts say applications are still a long way off.

"It's definitely fundamental work right now," said Sykes.

One reason for that is the extreme temperature needed to measure the molecules as they spin. Since higher temperatures increase the rate at which the motors spin, the scientists had to keep the molecules in a chilly environment.

Even at 100 Kelvin, or about negative 279 degrees Fahrenheit, the molecules spin at an astonishing million rotations per second. To have the capacity to measure what was going on, the research team had to keep temperatures closer to 5 Kelvin, or about negative 450 degrees Fahrenheit.
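As a quick sanity check on those figures, the kelvin-to-Fahrenheit conversion can be sketched in a few lines of Python (illustrative, not from the research itself):

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature in kelvins to degrees Fahrenheit:
    F = (K - 273.15) * 9/5 + 32."""
    return (k - 273.15) * 9 / 5 + 32

# The measurement temperature and the chilled observation temperature:
print(round(kelvin_to_fahrenheit(100), 2))  # -279.67, about negative 279 F
print(round(kelvin_to_fahrenheit(5), 2))    # -450.67, about negative 450 F
```

Both results agree with the temperatures quoted in the article.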

To apply the information found in the study to practical applications in medical or engineering fields, scientists must figure out how to operate at easier-to-attain temperatures, an issue Sykes believes is conquerable.

"I think we just have to be more clever with the chemistry of the motors to have different molecules and stronger bonds," said Sykes.

However, different problems may exist with the practicality of this newly revealed research.

"Low temperature is not the only serious problem for applications of the phenomenon," Alex Vologodskii, research scholar of chemistry at New York University, told TechNewsWorld.

If advances in technology do come through, Sykes envisions the motors being used in medical devices, perhaps to pump fluid through tiny channels so medicine can be delivered to a precise spot, or to sense the local environment for a clearer diagnosis. Those applications, though, are far from fruition.

Solyndra plans bankruptcy filing in blow to U.S. solar industry

The uncertainty around the future of U.S. support for renewable energy technology investments has taken a toll. U.S. solar technology maker Solyndra has confirmed that it plans to file for Chapter 11 bankruptcy protection, laying off some 1,100 employees in the process.
The news is doubly unsettling considering the fact that Solyndra had snagged some pretty high-profile commercial installations, including Qwest Field in Seattle, AND it had arranged a massive $535 million loan guarantee with the U.S. Department of Energy. This collapse will be scrutinized up and down for sure, and will likely emerge as an embarrassing campaign issue for President Barack Obama, who visited Solyndra. In mid-July, House Republicans called for close scrutiny of how Solyndra was picked for the loan guarantee, especially since the company’s IPO was pulled right after the deal was announced.

The Fremont, Calif.-based company publicly pinned its troubles on several factors, including its inability to ramp production as quickly as “larger foreign manufacturers,” price compression and vanishing credit for financing solar systems.

In a statement, Solyndra President and CEO Brian Harrison said:

“We are incredibly proud of our employees, and we would like to thank our investors, channel partners, customers and suppliers, for the years of support that allowed us to bring our innovative technology to market. Distributed rooftop solar power makes sense, and our customers clearly recognize the advantages of Solyndra systems. Regulatory and policy uncertainties in recent months created significant near-term excess supply and price erosion. Raising incremental cash in this environment was not possible. This was an unexpected outcome and is most unfortunate.”

Apple criticized for questionable green-tech policies in China

About a year ago, a group called Pacific Environment came out publicly to criticize Apple’s use of certain manufacturing organizations in China that it believes use questionable environmental and public health practices. This month, there is another report out by several non-governmental organizations suggesting that the giant technology company has done little to address those concerns, at least publicly, and that it should do more.

That report, “The Other Side of Apple II: Pollution Spreads Through Apple’s Supply Chain,” is signed by five groups including Friends of Nature, the Institute of Public & Environmental Affairs, Green Beagle, Envirofriends, and the Green Stone Environmental Action Network. I don’t really know much about any of them, to be honest. The report details their investigations into Apple suppliers that have known violations in both pollution emissions and improper disposal of toxic substances. One of the suppliers, Meiko Electronics in Guangzhou, has been penalized for more than 10 different violations, according to the report. The report notes:

“The large volume of discharge in Apple’s supply chain greatly endangers the public’s health and safety. Through the process of our investigations, we discovered several suspected suppliers to Apple that have been the target of numerous complaints from local communities.”

Mind you, there is one word in that previous paragraph that really caught my attention, “suspected.”

It leads me to wonder whether these organizations can really say with authority that the manufacturing organizations and sites they are targeting are actually working with Apple, or whether they are making an educated guess based on Apple’s voluminous supply chain. No one really knows, and that is part of the problem.

The thing is, Apple does publish supplier responsibility reports, and it has publicly engaged with this issue. This is not the first time, however, that the company has been accused of being less than forthcoming about its green credentials. In this specific report, the NGOs say “Apple has systematically failed to respond to all queries regarding their supply chain environmental violations.”

To be fair, Apple probably doesn’t have the sort of deep visibility into its supply chain partners that these NGOs are demanding it have. It reminds me of a situation that several of the big apparel makers — Nike, Adidas and PUMA — are facing with respect to water management and toxic chemical discharge policies at some of their own suppliers.

The fact is, these big companies don’t always have the detailed view into their suppliers’ business dealings that some environmental groups are beginning to demand — especially when it comes to Chinese business partners. That’s a problem that more shareholders of public companies are starting to scrutinize.

The report suggests that Apple is a special case. The organizations note: “Even when faced with specific allegations regarding its suppliers, the company refuses to provide answers and continues to state that ‘it is our long-term policy not to disclose supplier information.’ A large number of IT supplier violation records have already been publicized; however Apple chooses not to face such information and continues to use these companies as suppliers. This can only be seen as deliberate refusal of responsibility.”

Based on Apple’s long-standing tradition of not commenting on pretty much any news it hasn’t tightly controlled, I’m not sure I share that assessment of why the company has stayed silent on this particular issue. I think it is staying mum simply because that is its normal tactic when dealing with unpleasant or uncontrolled news.

In this instance, Apple spokeswoman Carolyn Wu told the Reuters news organization:

“Apple is committed to driving the highest standards of social responsibility throughout our supply base. We require our suppliers provide safe working conditions, treat workers with dignity and respect, and use environmentally responsible manufacturing processes wherever Apple products are made.”

Wind technology game-changer snags another $15 million

Making wind turbine technology more efficient — at any wind speed — is the holy grail for the industry. So it is easy to see why Danotek, a start-up that uses permanent magnets rather than traditional generator designs to harness wind power, has captured the attention of four prominent cleantech investors. That group — Khosla Ventures, CMEA Capital, GE Energy Financial Services and Statoil Technology — has put up $15 million to help Danotek scale up its production. The new gust of funding brings Danotek’s total money raised so far to $41 million.

According to the investors, Danotek’s technology is interesting because it is lighter than traditional options, can potentially last longer because it has fewer moving parts, and can generate energy even in low wind conditions. The company already has $50 million in orders for the technology, which could save up to $1 million per turbine over its lifespan on a typical wind farm.

Said Danotek President and CEO Don Naab:

“We are already contracted with some of the wind industry’s leading turbine manufacturers, with our first systems going up-tower later this year, and we’re engaged in multiple negotiations with several other globally recognized turbine manufacturers. Danotek is on track and well positioned for success.”

Related wind technology stories:

5 small wind technology players angling for mainstream attention
Wind power gets big vote of confidence (and some dough) from Google