Thursday, August 25, 2011

Mobile Tech » Sony Looks Into the Mirror to Boost Its DSLR Cred

Posted by echa 5:26 AM, under | No comments

Sony's new line of SLT digital cameras packs on the megapixels. But it might be the mirrors that set the DSLRs apart from the pack. The cameras use translucent mirror technology and electronic viewfinders. "We do not expect Canon or Nikon to adopt these any time soon, as they're more conservative in this aspect of product design," said IDC's Chris Chute.

Sony (NYSE: SNE) has announced two new additions to its SLT-A family of cameras: the A77 and A65.

They offer 24.3MP effective resolution and have what Sony says is the world's first XGA OLED electronic viewfinder.

Both cameras use the translucent mirror technology common to Sony's SLT-A family and offer progressive full HD video recording.

"No other DSLR offers full HD recording in 60p, 50p, 25p, and 24p," digital photographer and film director Preston Kanak told TechNewsWorld.

Sony is unique in using translucent mirrors, remarked Chris Chute, a research manager at IDC.

"We do not expect Canon (NYSE: CAJ) or Nikon to adopt these any time soon, as they're more conservative in this aspect of product design," Chute told TechNewsWorld.

Sony did not respond to requests for comment by press time.

Tech Specs for the SLT-A Family Additions

The SLT-A77 can capture full-resolution images in bursts at 12 frames per second with full-time phase-detection autofocus. The A65's speed for this is 10 frames a second.

Both cameras use cross sensors with multipoint autofocus systems for precision tracking. The A77 has a 19-point autofocus system with 11 cross sensors, and the A65 a 15-point autofocus system with three cross sensors.

This lets the cameras maintain their focus lock on a designated moving object even if another object temporarily blocks it from view.

"It's not so much the sensors, but the logic that connects them," Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld.

"The result is supposed to be better focus on complex subjects with subject material dispersed both front and back, and side to side, and the resolution's very high, which should allow for better editing," Enderle elaborated.

The new cameras use Sony's newly developed Exmor APS HD CMOS sensor, which gives them that effective resolution of 24.3MP.

Exmor sensors combine the speed of CMOS sensors with advanced-quality image sensor technologies to provide enhanced resolution for more detailed images.

"With massive improvements in editing tools, the quantity of good data is more important than the quality of any one shot," Enderle said.

The Exmor sensors are teamed with the latest version of Sony's BIONZ image processing engine. This speeds up the conversion of raw image data from the Exmor sensors into the format stored on the camera's memory card.
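As a rough illustration of what an image processing engine does at this stage (a simplified sketch of the general raw-conversion idea, not Sony's BIONZ pipeline), raw sensor data has to be demosaiced from the color filter pattern, tone-mapped, and encoded into the stored format:

    # Simplified, illustrative raw-to-JPEG pipeline (a sketch of the general idea,
    # not Sony's BIONZ engine). Assumes a Bayer-patterned raw frame as a NumPy array.
    import numpy as np
    from PIL import Image

    def demosaic_nearest(raw):
        """Very crude demosaic: collapse each 2x2 Bayer cell (R, G, G, B) into one RGB pixel."""
        r = raw[0::2, 0::2]
        g = (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2]) / 2
        b = raw[1::2, 1::2]
        return np.stack([r, g, b], axis=-1).astype(np.float32)

    def process(raw, gamma=2.2):
        rgb = demosaic_nearest(raw)
        rgb /= rgb.max() or 1.0            # normalize to 0..1
        rgb = rgb ** (1.0 / gamma)         # simple tone curve
        return (rgb * 255).astype(np.uint8)

    if __name__ == "__main__":
        raw = np.random.randint(0, 4096, size=(768, 1024), dtype=np.uint16)  # fake 12-bit raw
        Image.fromarray(process(raw)).save("out.jpg", quality=90)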

Sony uses the BIONZ engine in several of its cameras, including those in the DSC family.

Both cameras use what Sony says is the world's first XGA OLED Tru-Finder. This electronic viewfinder has an XGA resolution of roughly 2.36 million dots (1,024 x 768 pixels with three dots per pixel) and offers a high-contrast image with 100 percent frame coverage, Sony claims.

The cameras have a Smart Teleconverter feature that lets users compose shots in the Tru-Finder, without looking away from the viewfinder, and capture them as 12MP images.

"One thing that's great about the technology is the new OLED," Kanak said. "We couldn't use the viewfinder to see our compositions with previous technology."

The A77 has a three-way adjustable screen that Sony claims is another world's first.

Other technical details are available from Sony.

Mirror, Mirror on the Wall

The latest developments in Sony's translucent mirror technology make the A77 and A65 the quickest and most responsive interchangeable lens cameras in their class, the company claims.

"This is a brilliant move by Sony to use translucent mirror technology this way, providing weight and speed advantages to its cameras," Enderle said.

Translucent mirror technology replaces the optical pentaprisms used in DSLRs with electronic viewfinders, making digital cameras smaller and lighter. Introduced in the 1960s, it was used in specialized high-speed cameras such as the Canon Pellix QL, but it was too expensive for mainstream use, Enderle stated.

Competitors to the A77 and A65 are the Panasonic Lumix and the Canon 60D and 7D, Kanak said.

The question now is whether consumers will bite.

"With the A77, Sony is now offering cameras that compete quite effectively against cameras such as the Canon 7D," Carl Howe, director of anywhere consumer research at the Yankee Group, told TechNewsWorld.

"Its only challenge is convincing high-end buyers to switch brands," Howe added.

Mobile Tech » iPhone Could Bring Agony and Ecstasy to Sprint

Posted by echa 5:19 AM, under | No comments

A report this week that Sprint will soon offer the Apple iPhone on its network has excited the carrier's investors -- Sprint's stock rose about 10 percent on the news. Offering the phone would allow the carrier to better compete with its rivals' device portfolios; however, questions have arisen regarding network strain and how Sprint will handle factors like the phone's high subsidy cost.

It looks like Sprint (NYSE: S) is about to join Club iPhone.

The wireless network could become the Apple (Nasdaq: AAPL) smartphone's third U.S. wireless carrier in October, just in time for the holiday season, according to a recent Wall Street Journal report. That's also the time frame in which Apple is expected to release the fifth generation of iPhone, though the company has not officially announced a date.

Sprint's stock rose 10 percent on the news Tuesday.

AT&T (NYSE: T) offered the iPhone exclusively in the U.S. from the device's introduction in 2007 until Apple added Verizon earlier this year.

Despite Sprint's stock gain, though, the news has prompted some doubts about the value of the deal for Sprint. The iPhone reportedly comes with a subsidy price tag that is larger than that of other smartphones. Those subsidies could cut into Sprint's margins.

On the plus side, the addition of the iPhone would put Sprint closer to an equal footing with AT&T and Verizon, much larger competitors. Verizon has roughly 106 million subscribers. AT&T has an estimated 99 million, and Sprint has 52 million. For Apple, the move is an unambiguous plus.

Sprint declined to speak with MacNewsWorld, noting the company does not comment on speculation. Apple did not respond to MacNewsWorld's request for comment by press time.

Good for Apple - So-So for Sprint

Questions have arisen regarding whether the iPhone will put strain on Sprint's network. There has also been speculation that Sprint will have to drop its unlimited data plan to protect its network.

"I don't think it's going to kill the network," Kevin Burden, VP of mobile device research at ABI Research, told MacNewsWorld. "We're at a point now where most mobile operating networks are tuned to handle volume without bringing things down. Maybe four or five years ago it would have been a problem."

The deal will most likely happen, Burden said, and he believes it will be a strong subscriber grab for Apple. "Apple needs to get as many subscribers as it can," said Burden. "They're looking at strong competition from Androids and the upcoming BlackBerry 7. They don't have the commanding lead anymore. The phones are all getting really similar. In the U.S. market, the last big thing they can do is get Sprint."

As for the question about whether the "Apple tax" in high subsidy costs will hurt Sprint, Burden believes Sprint is accustomed to subsidies. "The subsidies won't kill Sprint. It's not like they're not used to subsidies," said Burden. "It's not a new business model."

The deal is likely timed for October, when Apple is widely expected to release its next iPhone version to coincide with the beginning of the holiday season.

"The introduction of iPhone 5 seems like the logical moment for Sprint to launch iPhones," said Burden. "Coming out with the iPhone 4 would be silly. It wouldn't be a big deal. So you can predict when it will have it -- they'll have it with the introduction of the iPhone 5."

The Phone for the Big League

Adding the iPhone clearly ups the ante for Sprint, giving it equal footing with AT&T and Verizon in its line of devices.

"Sprint talks about how important the device portfolio is," William A. Stofega, program director, mobile device technology and trends at IDC, told MacNewsWorld. "Sprint has had some decent devices. They've done well with HTC, which runs on Android. Even so, Sprint would like to have the iPhone. Questions about the iPhone come up on their earnings calls all the time. Having the iPhone shows they're a player. In overall numbers, they may not be in the same league as AT&T and Verizon, but they're sill a player."

On the other hand, after its second-quarter earnings report, Sprint was criticized for the high costs associated with adding and retaining customers. Adding the iPhone ratchets up those expenses. "The iPhone subsidy costs hurt margins even with Verizon," said Stofega. "The iPhone is that special case all by itself. The carrier has to subsidize at a higher rate to drive and retain customers."

A Sprint move would also come at a time when Android devices are getting the cool-factor nod from consumers.

"The world is changing. Droid is getting mindshare. It's a juggernaut," said Stofega. "It can't be slowed down by anything. The Motorola acquisition could help. There has been a need for an Android platform that shows off its full capabilities and is much more integrated between hardware and software. Motorola could make that happen."

IT Management » Federal CIO Prods Agencies to Revamp Websites

Posted by echa 5:15 AM, under | No comments

The website reform plan is part of a larger federal program to improve customer relations with the public through various means, including the use of innovative technologies. "The federal government has a responsibility to streamline and make more efficient its service delivery to better serve the public," President Barack Obama said in kicking off the customer service initiative earlier this year.

The U.S. government's new chief information officer wasted little time in directing federal agencies to significantly improve the way they manage information technology resources. Steven VanRoekel, who took over the federal CIO post on Aug. 4, quickly issued a directive designed to push agencies to meet an Obama administration goal for operating federal websites more efficiently.

While VanRoekel's directive was actually mandated by an Office of Management and Budget (OMB) memo issued last June, the tight deadline he set for compliance by federal agencies indicates the importance he attaches to the administration's federal online reform initiative.

The goal of the federal website reform program is "to improve online services and eliminate wasteful spending by developing a comprehensive and consistent strategy for efficiently managing web resources and assuring that valuable content is readily accessible and available online," VanRoekel says in the directive.

The first step in the program is an OMB order "freezing" the creation of new federal websites. VanRoekel extended the freeze through Dec. 31 "to reinforce the importance of curtailing the proliferation of stand-alone government web sites and infrastructure."

Targeting 1,000 Domains

The U.S. government maintains 2,000 Web addresses (URLs) that support 24,000 sites. The reform initiative sets a goal of reducing the URLs by half by mid-year 2012.

In his directive, VanRoekel sets some ambitious targets for meeting the website reform goal. Agencies must cut the number of federal executive branch domains by 25 percent by the end of September, either by eliminating them or redirecting them to existing sites. Also, by Sept. 6, agencies must provide an interim progress report with a list of government website domains that are outdated, redundant or underperforming, as well as a list of "redirects" that no longer provide value, or domains that are nonfunctioning and no longer in use.

By Oct. 11, federal agencies must provide a report with an inventory and analysis of all registered dot-gov domain names and an assessment of "web governance." The report must include a list of candidate sites for merger or elimination, a list of sites that provide high value to the public and can serve as models, and a plan for managing websites more efficiently.

A panel of government IT specialists known as the ".Gov Reform Task Force" has been established to assist agencies in meeting the web improvement program goals.

The website reform plan is part of a larger federal program to improve customer relations with the public through various means, including the use of innovative technologies.

"The federal government has a responsibility to streamline and make more efficient its service delivery to better serve the public," President Barack Obama said in kicking off the customer service initiative earlier this year.

The General Services Administration (GSA), which has taken a lead role in improving the use of information technology at the federal level, is taking an active part in the website reform program, partly through its support of the Federal Web Managers Council.

"The Council is very supportive of the website reform effort. We've been calling for agencies to clean up the clutter for many years now," Rachel Flagg, deputy director of the Center for Excellence in Digital Government at GSA, told CRM Buyer. Flagg also serves as a cochair of the Council.

"Streamlining and consolidating websites ... will help the public to more easily find the information and services they need and enable agencies to more effectively manage their online information. It should also save money, if agencies take advantage of their existing Web infrastructure to host their content instead of creating new, stand-alone sites which often require additional servers, content management systems and design services," Flagg said.

GSA already has adopted some of the reforms on its own websites. The agency recently deleted redirects for dot-gov domains that yielded fewer than 300 referrals annually. However, GSA kept redirects such as firstgov.gov, a legacy domain that refers hundreds of thousands of users annually to the current domain, USA.gov.
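The GSA rule above amounts to a simple filter. The sketch below uses hypothetical domain names and referral counts to show the pruning logic: redirects under the 300-referrals-a-year threshold are flagged for deletion, while heavily used legacy domains such as firstgov.gov are kept.

    # Illustrative sketch of the pruning rule described above: drop redirect-only
    # domains with fewer than 300 referrals a year, keep heavily used legacy ones.
    # The domain names and counts here are hypothetical examples.

    ANNUAL_REFERRAL_THRESHOLD = 300

    redirects = {
        "firstgov.gov": 450_000,      # legacy domain still sending users to USA.gov
        "exampleprogram.gov": 120,    # hypothetical low-traffic redirect
        "oldinitiative.gov": 75,      # hypothetical low-traffic redirect
    }

    keep = {d: n for d, n in redirects.items() if n >= ANNUAL_REFERRAL_THRESHOLD}
    retire = sorted(set(redirects) - set(keep))

    print("Keep:", ", ".join(sorted(keep)))
    print("Candidates for deletion:", ", ".join(retire))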

Public and Private Sector Lessons

"Our definition of transparency is that information should be easy to find, it should be available quickly, and it should be easily accessible," Larry Freed, president and CEO of ForeSee, told CRM Buyer. "If anything, the larger aims of the government reform task force will greatly improve transparency because it will address those three core issues." ForeSee is a customer experience analytics firm that works with more than 200 federal government websites.

"This task force won't get us all of the way there, but it is a definite move in the right direction," Freed said.

While the goal of the reform effort is to improve communication, the initiative does pose some challenges.

"The temporary freeze on the issue of new dot-gov domains is unlikely to have much effect on transparency or public access to information," said Daniel Schuman, director of the Advisory Committee on Transparency for the Sunlight Foundation.

"There's a bigger concern with the consolidation of domains, if the consolidation means that information is taken off the Web or becomes more difficult to find," he told CRM Buyer.

"It's generally not the number of websites, but rather their internal structure and the way information is described that's confusing to the public," Schulman explained. "The government needs to make more use of websites like data.gov to share data sets and other types of information that power private sector innovation. The public needs to be empowered to reuse information gathered by the government. All too often, government information is made publicly available in difficult-to-use formats."

The program should not only improve customer contact mechanisms but also provide substantial savings.

"Stand-alone domains can cost hundreds of thousands or millions of dollars to maintain," notes VanRoekel's Aug. 4 reform memo. "Although migrating a stand-alone web site to another domain requires an initial investment, consolidating infrastructure and operations can provide significant long-term cost savings."

The website improvement panel is composed of federal Web managers, but the group is open to help from the private sector.

"The Task Force has, in fact, already consulted with some industry experts," said Flagg, "and there are plans to open a public dialogue on this topic very soon. We welcome input and feedback from both industry experts and the general public."

Internet » New Facebook Privacy Tweaks Have a Googley Aftertaste

Posted by echa 5:11 AM, under | No comments

Facebook has given its users a new set of privacy controls, allowing them greater powers to select who sees what and approve tagged posts. The new features appear somewhat similar to those found in Google+, the social network that could prove to become a major Facebook rival. Facebook's changes have received a nod of approval from some privacy advocates.

In a nod to users who have complained about Facebook's privacy settings for years, the social network announced new, simplified settings Tuesday that allow users to exercise greater control over what information is shared across the network.

Going forward, users can choose a feature called "Profile Tag Review," which allows them to approve a photo or post in which they're tagged before it appears on their profile, or simply remove the tag.

The upgrades also make it easier to share tagged photos or posts with specific individuals or groups, much like the Circles feature in rival network Google+.

Those controls, and other privacy settings such as the option to see how your profile appears to a particular individual, will now appear in a drop-down menu next to photos and posts for easier access to the security features.

Although users now have the option to refuse a tag, they may have to do so more often -- Facebook also announced that users can now tag anyone, even non-friends, in photos or posts. Early critics worried that option could be used in unintended ways, such as by advertisers or spammers looking for a new way to recruit customers, but Facebook doesn't think that's a concern.

"Something to note is that whenever you're tagged by a non-friend, it will always go into your Pending Posts section of your profile [regardless of whether you've turned on the Profile Tag Review or not]" Meredith Chin, product communications spokesperson at Facebook, told TechNewsWorld.

Facebook also expanded the location-based side of the site. Now, users will have the option to tag their location from anywhere, not just from a mobile device.

The changes will begin rolling out gradually on Thursday, and once they reach a user's profile, a tour will guide the user through the updates.

Privacy Report Card

Due to the number of complaints and public relations headaches Facebook's privacy policies have caused it in the past, the company worked with technology privacy advocates to make sure the new settings would receive a warm welcome, and so far they seem to have made a positive impression.

"On the big picture we think these changes look very good. Facebook has been working to develop these for a while and made a real effort to make sure these are intuitive changes for users, that users understand how they work and don't accidentally overshare," Erica Newland, policy analyst at the Center for Democracy & Technology told TechNewsWorld.

One initial concern was the new ability to tag non-friends in photos. Facebook touted the feature as helpful when tagging a photo of a group of co-workers, for example, or acquaintances who may not necessarily be Facebook friends, though there was concern it would become just another way for spammers to work their way into the ecosystem. Since those photos or posts must be approved, though, the user is given a measure of control.

"I think there is some sense to allowing people to tag non-friends. It's Facebook's decision on how to optimize that experience for users, but it's important they're giving users the option to exercise control. That's something users have asked for a while and it's absolutely a step in the right direction," said Newland.

It's a direction many social networks are taking. Since the lines between what is appropriate to share online blur between generations, professions and lifestyles, networks are leaving it up to users to decide just how much of their info they want out there.

"We're happy Facebook is creating a forced choice. That's a very good model for privacy controls, rather than assuming you know what the user wants," said Newland.

Pressure From Plus?

Facebook's new controls are entering the scene around the same time as Google+, the search giant's attempt at a competitor to challenge Mark Zuckerberg's far-and-away leader in social networking.

After its debut in July, Google+ saw an unprecedented, almost immediate surge of users, and there was speculation it was because of the network's more personalized, controlled sense of privacy and security. In Google+, contacts are divided more naturally into groups, or what the site calls "Circles." Users choose from the outset who is a friend, family member or co-worker, for instance, and with each post or photo must decide which Circles they'll share it with.

The concept is similar to Facebook's changes, but the social network leader says it wasn't modeled after anything in particular.

"We've been working on these changes for the last several months. We're excited to be introducing a lot of changes that people have been requesting," Chin told TechNewsWorld.

The bigger question is not whether this was a competitive response, but whether all networks treat user information with the concern it deserves as online sharing becomes an inevitable part of the social scene.

"I can't speculate on how the two may have been connected, but what is really clear is that social networks see that privacy is a value for users. In order to attract and retain users, they have to offer controls. It's kind of a maturing of the social networking ecosystem," said Newland.

Computing » It's a Roll of the Dice for Linux Game Makers

Posted by echa 5:08 AM, under | No comments

The big problem with Linux users is their aversion to paying for anything, said tech analyst Rob Enderle -- so for Humble Bundle's developers to get customers to voluntarily pay for Linux games is in itself pretty amazing. ... The Linux derivative OS, Android, might well be the platform for change when it comes to Linux gamers parting with their cash.

If you had the option to pick your own price for a computer game that only runs on your Linux rig, would you pay to play? Not if you are a typical Linux gamer. At least, that's the popular perception of fans of free and open source software. Linux is available freely. So why pay for a game -- or any other Linux app -- when the FOSS mantra is based on a no-cost buy-in?

The team behind the Humble Bundle set of computer games is trying to buck the notion that Linux users are cheapskates. The company allowed its customers to name their own prices to purchase and download its software. Then it parsed the results by which OS the downloaders used.

Linux users were the most generous, often willing to pay more than Mac or Windows players when given the chance to name their own price for the for-purchase game software.

"For each of our Humble Bundle promotions, we have seen that on average Linux users are twice as generous -- if not more -- than Windows users, with Mac users falling in the middle. Linux users have also accounted for nearly a quarter of Humble Bundle's revenue," John Graham, cofounder of Humble Bundle, told LinuxInsider.

Vapor Market or Growing Trend?

Humble Bundle's marketing success with its pay-what-you-like pricing strategy may be nothing more than an anomaly. But with the rapidly growing popularity of Android, Linux users looking for a unique gaming challenge might be on the vanguard of a new pay-for-Linux marketing trend.

"Despite traditional arguments to the contrary, it is clear to us that there is a serious Linux gaming market out there, and we will continue to support Linux in the future," said Graham.

Some industry insiders who track buying and usage trends see Graham's pay-for-Linux-games success as nothing more than a passing fancy.

"Well, pure FOSS PC Linux on the desktop is a pretty small share of the [U.S.] market as far as I can tell, so I think that limits the upside even if it's possible that users would pay more in some cases," said Lewis Ward, IDC's research manager for consumer markets: gaming.

Want to Share?

In a survey conducted in the third quarter of 2010, Ward found only 1 percent of gamers used the Linux platform. That compared to more than 75 percent for the Windows platform and 4 percent for Apple (Nasdaq: AAPL) machines, he told LinuxInsider.

For a game developer who needs a base of users from which to operate, it does not matter how wonderful the platform is, he concluded. "It is just not a large enough base to say, 'I'm going to devote x amount of budget' to develop for a market that is less than 1 percent of the total."

Linux Players Don't Talk Money

Compared with users reached through other marketing strategies such as shareware, Linux users are less accepting of paying for software. Open source programs are expected to be free; free is part and parcel of using Linux. Paying for technical support for a free operating system, however, is different.

"Right now, it looks like Linux users don't really like to pay for anything, and there isn't a vendor doing a general consumer package, which is significantly the potential market anyway," Rob Enderle, principal analyst for the Enderle Group, told LinuxInsider.

The big problem with Linux users is their aversion to paying for anything, he added -- so for Humble Bundle's developers to get customers to pay for Linux games is in itself pretty amazing.

Android Antithesis

That said, a trend in paying for preferred gaming apps might be more productive on Linux-related portable devices. And the Linux derivative OS, Android, might well be the platform for change when it comes to Linux gamers parting with their cash.

"Gaming is all about numbers, and the developers are already building for Windows, iOS and Android," said Enderle.

Even programmers developing games and other apps for Android devices are not assured of having a large paying market, though. And since Android does not run on Linux desktops and laptops, game players might be willing to pay for an Android game but still be tightwads when it comes to a Linux distro equivalent.

Unfortunately, the Android platform, which is closest to Linux, is only providing a return for those who have figured out how to get advertising revenue. Android users do not appear to want to pay for apps, according to Enderle.

Show Me the Money

Unless they are willing to take a gamble, game developers have little incentive to enter the Linux game market. The market needs a strong scent of money for developers to make significant dollars and cents from Linux gamers.

"Until someone can do a Linux-based product that can demonstrate a sustainable revenue model for a game developer," said Enderle, "it is unlikely that it will get much interest by game developers who are already overcommitted and not that happy with the closest derivative, Android."

Who knows? If Linux players show they are willing to pay for games not available anywhere else, it might take off. So far, however, there is no track record for Linux users being willing to pay for games, he maintained.

A Place to Market

A major complaint among Android app developers, in general, is sales volume: they can make only about one tenth of what developers draw in Apple's App Store. That should not be a big surprise, however. The Android Market is built around the advertising model, according to Enderle.

Still, despite Android's comparatively better market potential, a Linux-only outlet might pose an attractive opportunity for game developers with the right type of game aimed at Linux users, suggested IDC's Ward.

Even so, the market just doesn't have enough volume to warrant the necessary expenditures for most game developers, he said.

Roll the Dice

The Linux game market is not lucrative for a major player like an Atari or Activision, but a smaller game company might stumble on a title that will sell a million more than it would have sold otherwise, Enderle noted.

A company could hit the right combination of game and market to provide enough revenue. You have to pick the target market for what it is you want to do, he said.

"I certainly wouldn't recommend tying the state of your company to pure free and open source software on the Linux desktop," said Ward. "If it's inside a browser, or if it has some Java platform so it's completely agnostic in terms of the operating system, then it doesn't matter."

Computing » New Phones, New Carriers, New Lines: Apple's Future Is Wild

Posted by echa 5:03 AM, under | No comments

As Apple fanatics wait with bated breath for an iPhone 5 announcement, vague reports of a completely new Apple product line bubbled to the surface. Elsewhere, Apple asked iOS developers to give users a little more privacy, and the iPhone may take on yet another U.S. wireless carrier soon.

As the summer draws to a close, Cupertino-watchers grow increasingly anxious for the big announcement proclaiming the iPhone 5's release date. New hardware is almost certainly on the menu, and a new U.S. carrier may join the iPhone family as well. Sprint (NYSE: S) will soon start carrying Apple's (Nasdaq: AAPL) smartphone, according to a report in The Wall Street Journal, which cited unnamed sources.

However, it may still be several weeks before rumors and unconfirmed reports turn into solid dates and product details.

Some Apple hounds were so anxious for news from the company this week that they contemplated vague reports from Asian suppliers that the tech giant could be introducing a gadget line so innovative and ahead of the curve that it could represent an entirely new product line. Reports didn't provide many details but speculated it could be a line of touch-enabled desktop-like computers.

"They're at a point, and a size and scale, where there always seems to be something coming, and the competition always seems to have something coming so they've got to stay ahead," Edward Zabitsky, principal and CEO of ACI Research, told MacNewsWorld.

Financial soothsayers, though, seem more focused on the company's future in areas like cloud offerings and its existing mobile lines.

"What's really happening is that we're moving toward light computing and these OSes that have been optimized and developed for mobile are really OSes for interface with the cloud. So over a long period of time they'll be migrating these light OSes -- that's the direction the industry is moving," said Zabitsky.

With the focus on the cloud, in particular its upcoming iCloud offering, Apple can really dig into its loyal, core user base to generate revenue and a truly streamlined product and consumer foundation.

"Apple is really focusing on the iCloud and Apple Me products. Apple can offer a free e-mail system, and I think people would rush up to have @me.com addresses. This is where it's headed, and it can be a pretty big thing," Hendi Susanto, analyst at Gabelli, told MacNewsWorld.

Apple did not respond to a request for comment.

Privacy, Please

As for iPhones, Apple hasn't revealed when customers can start lining up for their upgrades, but it did make one announcement regarding changes to the iOS platform when it debuts. Apple asked developers to refrain from using a Unique Device Identifier (UDID) when creating iOS 5 apps.

The UDID is used in multiple apps, usually involving ads and gaming, and can allow a developer or a third party to collect personal data about the user. That data can be helpful for developers as they broaden their apps and personalize ads or games for certain demographics, but many users have voiced their opposition to what they see as a personal invasion.

That access to information has led to PR headaches that apparently just aren't worth it moving forward, so Apple has asked developers not to include the identifier in upcoming apps.

As consumers become more focused on online security and preventing data theft, that practice may become the industry standard.

"Once you put GPS on a device and you build a platform to use the GPS, there are always people who will misuse it. We're becoming more sophisticated, not just the market but the handset vendors. It's just something that needs to be corrected. It's not an Apple-specific problem, but providing an interface with GPS is a dangerous kind of thing with a few developers," said Zabitsky.

Since privacy concerns exist across the board, this probably won't lead to an exodus of developers flocking to different platforms.

"I don't think it's going to result in any dent because I do think that other companies like Google (Nasdaq: GOOG) will also follow suit and this will still be an equal playing field," Susanto said.

Product Rumors

Whispers about the iPhone 5's release date remain focused on October. Since networks' high-speed 4G LTE technology started making its way onto the scene, the idea has been tossed around that the iPhone 5 will be the first Apple phone with 4G connectivity. Some reports have indicated that suppliers confirmed the company was testing with 4G networks, but as the release date draws closer, it seems less likely the iPhone 5 will come equipped with LTE.

"Apple always wants to be leading edge, but not bleeding edge. Their standard of quality is that you need 10 hours of battery life, and obviously those radios could cause a power drain. Apple, for now, it appears, have sought to side with caution making sure that the technology is up to their standards. It's a tradeoff," said Zabitsky.

Another product that is likely being held off so as not to compromise quality is the iPad 3. A while back, rumors suggested Apple's next tablet would be on shelves for the holidays, and although there were rumors it entered production this week, it probably won't be available for consumers until after the holiday rush.

"I think that the current Apple supply chains still couldn't meet a high production yield, specifically for displays, because Apple wants to provide a much higher resolution for its new iPad. It might be pushed to early next year," said Susanto.

Computing » Fighting the Good Global Cybercrime Fight: Q&A With Security Guru Mikko Hypponen, Part 2

Posted by echa 4:57 AM, under | No comments

"First, back up! It should be off-site, whether in the cloud or in a removable disk that you take to your grandmother's house. It doesn't have to be daily, either -- for most home users, a backup from last month is good enough. Failing that, though, if something happens, you'll lose a part of your life."

Mikko Hypponen has spent the past 20-plus years studying malicious software, including everything from "Brain" -- the first PC virus, dating back to 1986 -- all the way up to Stuxnet and today's most sophisticated global malware.

He's widely considered one of the world's foremost experts on information security, and he's played a key role in taking down numerous international rings of cybercriminals.

TechNewsWorld recently had a chance to speak with Hypponen about his views on the need for a new model of law enforcement in order to fight global cybercrime effectively. That discussion is presented in Part 1 of this two-part series.

Given the ongoing debate about the relative merits of the various operating systems and platforms when it comes to security, however, we asked him to share his thoughts on that topic as well.

TechNewsWorld: It seems the majority of malware targets Windows. Do you think that's just because of its ubiquity, or is there also something about the technology that's weaker?

Mikko Hypponen: It's a complicated issue. If we separate computers and smartphones, we have computers running Windows, OS X and Linux on one side, and we see much the same spread on the smartphone side with Windows Phone, iOS and Android.

On the computer side, Windows gets almost all the attacks, but on the smartphone side, it's Android that is getting hammered. Windows, meanwhile, doesn't get targeted at all on the phone side.

It's really not a fair comparison, though, because it's mostly about market share. We find more computer malware every day, but it's unfair to consider Windows as one group. In fact, we really have Windows XP, Vista and Windows 7.

Of the three different versions of Windows, OS X and Linux, Windows XP is definitely the least safe. It's 11 years old, and it also has the biggest market share, with 50 percent globally -- Win 7 has just 20 to 23 percent.

Attackers have never had it so good. Not only is XP the weakest, but it's also the most popular. Attackers have low-hanging fruit to enjoy as long as there is such a huge target.

TNW: How would you compare Mac vs. Linux vs. Windows for their ability to prevent or mitigate attacks? Which would you recommend?

Hypponen: For the average beginner user, I'd recommend a Mac. It's easy -- easier to maintain than Linux, and the likelihood of getting infected is much lower than with Windows. Macs represent just three to four percent of the market globally.

But any feeling of superiority for Mac or Linux users is not the right attitude. They also have problems with phishing and spam -- those target everyone.

TNW: How would you compare open source vs. proprietary software in general in this context?

Hypponen: The truth is that pretty much nobody looks at source code and tries to find bugs. In that way, the 'theory of many eyes' doesn't work.

The big difference with open source software, however, is that when vulnerabilities are found, anybody can fix them. When the code is closed, on the other hand, only the vendor can fix it.

We see open source apps getting targeted all the time, such as Firefox and Chrome. So do Flash plug-ins, etc. The practical differences aren't that large, but with open source, the fixes are generally available much faster.

TNW: For greatest security, which operating system should a person use?

Hypponen: For beginners, I'd recommend a Mac, as I said. For expert users, though, I'd say some Linux distribution, or if you prefer Windows, 64-bit Win 7. There is a big difference between the 32-bit model and 64-bit Windows, such as in loading drivers.

Of course, if you really want to split hairs, you could argue that the version of Windows inside the Microsoft (Nasdaq: MSFT) Xbox 360 is the most secure. The only networking is encrypted IP6, for example.

Of course, it's not really a fair comparison, since it's inside a console. Win Phone is also very secure, but it's also much more closed.

TNW: What other steps do you recommend to keep users' computers and data safe?

Hypponen: First, back up! It should be off-site, whether in the cloud or in a removable disk that you take to your grandmother's house. It doesn't have to be daily, either -- for most home users, a backup from last month is good enough. Failing that, though, if something happens, you'll lose a part of your life.

Also, make sure you're up to date with the latest version of the software, regardless of the operating system you use.

If you're on Windows, run an antivirus and use a separate firewall. If you're on a laptop and use WiFi hotspots, make sure you have some kind of VPN.

Say you're working at Starbucks (Nasdaq: SBUX). I recommend using VPN even if you're just using Facebook, simply for the encryption. Then it doesn't matter if someone else in the coffee shop is snooping. Home users, especially, tend to ignore this completely.

Finally, especially if you're on Windows, make sure you're really running an antivirus, and don't just think you are. It's become a standard feature for malware, if it manages to bypass your antivirus one time, to uninstall it and replace it with something else.

So, just because you installed some antivirus software half a year ago, don't assume it's still working. Double-check to make sure it works and that it's still updating.

Computing » Pybackpack Makes Saving and Restoring Easy as Pie

Posted by echa 4:52 AM, under | No comments

Not having a backup strategy for your computers is much like not having anti-intrusion protection. For any solution to work well, you must have a product that provides what you need. Then you must actually use the product regularly. That is what I like about pybackpack. I set it to update my backup file set at specific time intervals. Then if I should need to restore my system files, I click the restore button, and it is done.

Sure, we all know that making regular file backups is an essential survival task for frustration-free computing. But backing up data and backing up computer system files are not entirely the same things. Doing one without the other is like having an uninterruptible power supply (UPS) that's not connected.

For instance, you no doubt have multiple copies of your critical data. But how many copies do you have of your Linux desktop home folder? Chances are that unless you have already been burned by a system crash, you see no need to back up your system files.

Rather than suffer that burning sensation, get preventive relief with File Backup Manager. At least, if you run Ubuntu Linux, that is what Canonical's distro calls this backup option. Its more generic name is "pybackpack." It is an easy-to-use file backup tool written for the Gnome desktop environment and released under the GPL.

Most distros, including Ubuntu, do not bundle this application as the resident default choice. But it is easily found in their package managers, without having to fuss with compressed archives.

I have used two other backup solutions to duplicate critical files. Both are fine but work differently from File Backup Manager/pybackpack. Keep and Back In Time take care of my working data and system configuration.

Either one does the job without any hassles. But I am a redundancy fanatic. And I work on a collection of laptops and desktops. Each one has a slightly different setup. File Backup Manager/pybackpack makes the task of backing up my home directory on each system a no-brainer.

Basic Strategy

Not having a backup strategy for your computers is much like not having anti-intrusion protection. For any solution to work well, you must have a product that provides what you need. Then you must actually use the product regularly.

That is what I like about pybackpack. I set it to update my backup file set at specific time intervals. Then if I should need to restore my system files, I click the restore button, and it is done.

A favorite alternative OS I use on each computer is Puppy Linux. It loads from an encrypted boot CD or USB drive. All system files and data are saved within a special configuration file. So for each Puppy Linux installation, all I have to do is copy the save file to my external storage drive.

Using pybackpack lets me do essentially the same thing automatically with other distros. It took a few more steps to convert the downloadable compressed package into a Puppy Linux package. But once done, I was able to have a redundant backup procedure that works the same on all of my dual boot configurations.

File Backup Manager Primer

Now at version 0.5.8, this application has been on a slow upgrade path. But in its current state it does everything it was designed to do, so there is little need for improvement.

It was originally written by Dave Arter as part of the Google (Nasdaq: GOOG) Summer of Code in 2005 for the Fedora Project. Afterwards, it fell into a short period of inactivity, but now Andrew Price maintains it.

A few key things about the app's design contribute to its ease of use. One is the simple, direct user interface, which is actually a graphical front end to the rdiff-backup application. I particularly love an intuitive design that requires no documentation or trial and error to figure out.

Another is being able to give unique names to each backup set. This lets me store all the backups in one folder on an external drive and tell each one from the others. I can easily maintain sets of files and directories for specific computers and run the backup process many times without losing track of them. I can also run incremental backups.
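Because pybackpack is a front end to rdiff-backup, the same named, incremental backup sets can also be driven from a script. This is a minimal sketch assuming rdiff-backup is installed and an external drive is mounted at /media/backup; the paths and set name are examples, not pybackpack's own configuration:

    # Minimal sketch of driving rdiff-backup (the back end pybackpack wraps) from a
    # script. Paths and the backup-set name are examples; assumes rdiff-backup is
    # installed and an external drive is mounted at /media/backup.
    import subprocess

    SOURCE = "/home/me"
    DEST = "/media/backup/home-laptop"   # one named backup set per machine

    def backup():
        # Each run stores only the differences since the last one (incremental).
        subprocess.run(["rdiff-backup", SOURCE, DEST], check=True)

    def list_increments():
        subprocess.run(["rdiff-backup", "--list-increments", DEST], check=True)

    def restore(as_of="now", target="/tmp/restored-home"):
        # e.g. as_of="now" for the latest snapshot, or "10D" for ten days ago.
        subprocess.run(["rdiff-backup", "-r", as_of, DEST, target], check=True)

    if __name__ == "__main__":
        backup()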

External Drive Support

Pybackpack is registered to the System/Administration menu. It backs up files to remote locations over the network, to local file systems, to optical media such as writable CDs or DVDs, and to attached external media.

The attached part is critical. At first blush, it looks as if the application only supports an internal CD/DVD drive. But if you plug in an external hard drive or optical drive before loading the backup program, you can see it in the location picker window. You have to drill down through the file system to get to it, though, just as you would point to a file or device location in a file manager.

Computing » PCs Hit 'Big 30' - Next Stop, Boneyard?

Posted by echa 4:45 AM, under | No comments

Just because a technology has become commoditized doesn't mean that it has lost its vital edge or potential for innovation. What really seems to be IBM's (and now HP's) focus on life after PCs is an emphasis on developing and delivering the IT infrastructure beneath consumer and business computing.

Aug. 12, 2011, marked the 30th birthday of the IBM (NYSE: IBM) Personal Computer (PC) -- an event noted in numerous congratulatory and cautionary articles and blog posts. In the days since then, PC-related news has remained thick on the ground.

Most shocking, perhaps, was HP's (NYSE: HPQ) announcement that it was "looking at options" (i.e., sale or spinoff) for its PC business. Then a new report from IDC said Q2 2011 PC sales in China surpassed sales in the U.S. for the first time, suggesting a trend that would see China's full-year PC market surpass that of the U.S. in 2012.

So, is the PC market dying or very much alive? Are vendors like HP smart to be pulling back, or should they stay in the game?

Is Innovation Device- or User-Dependent?

One article considering the IBM PC's birthday was written by Mark Dean (now CTO of IBM Middle East and Africa), one of the dozen engineers who designed the first IBM PC. Dean noted his pride in those accomplishments but also admitted being proud of IBM's decision to leave the PC business and to sell that division to Lenovo in 2005.

That move, Dean said, reflected IBM's position at the "vanguard of the post-PC era." He added that while PCs will continue to be much used, they no longer represent "the leading edge of computing" and are "going the way of the vacuum tube, typewriter, vinyl records, CRT and incandescent light bulbs."

How reasonable is Dean's conclusion? Consider this: Since their introduction three decades ago, PCs have enabled and inspired
  • computation -- typically involving numeric and/or data intensive processes and applications;
  • communication -- including consumer/business processes such as email, as well as Web-based technologies like IM, VoIP and video conferencing;
  • creation -- such as traditional text-based productivity applications, as well as increasingly sophisticated graphics tools and photo/video production; and
  • consumption -- across a range of processes and markets, including online commerce and fulfillment, audio/video purchase and playback, and many more.
One could argue -- successfully, I believe -- that these four points remain the essential pillars and value propositions of personal computing and PCs, as well as PC-like devices with new form factors like increasingly powerful tablets and smartphones, and even stunningly popular social networking services and sites.

In other words: The PC is dead. Long live the PC.

Has the Post-PC Era Really Begun?

Dean is hardly the first person to espouse the notion of the "post-PC era." In fact, it's an increasingly common pronouncement in the industry, particularly among those promoting tablet devices. But market dynamics make those declarations problematic.

PC sales appear to be growing despite expanding enthusiasm for tablets and smartphones. Knowledgeable analysts and vendors pegged 2010 PC shipments at roughly 1 million units per day (360M+), and early estimates suggest that 2011 sales could well be higher.

Moreover, as noted in the new IDC report, emerging markets including China, India and South America -- where tablets have far less mindshare than they do among well-heeled U.S., European and Asian users -- are enjoying robust PC sales and will likely continue to do so. In essence, the PC market isn't anywhere near peaking, let alone retreating.

Additionally, PC vendors and component manufacturers aren't simply ceding the market to Apple (Nasdaq: AAPL). Among the iPad's greatest differentiators, at least initially, were its lightness and battery life. Notebook vendors have steadily whittled away at those advantages, but they should also benefit from Intel's (Nasdaq: INTC) "Ultrabook" initiative, which stresses development of notebooks combining high performance, all-day battery life and tablet-like features.

Intel says Ultrabooks will constitute 40 percent of all notebook designs in 2012. That's an aggressive outlook, especially since OEMs' execution is crucial. But if Intel and PC vendors succeed, they will demonstrate the industry's ability to rapidly adapt to radically shifting user requirements. This is no new thing. Most PC component makers and vendors are well-attuned to changes in attitude among consumers and businesses -- they wouldn't last long, otherwise.

Death or Rebirth by Commoditization

That brings us to Dean's comments about IBM's 2005 sale of its PC division to Lenovo, and HP's apparent interest in following suit.

In FY2010, a year enlivened by new PC technologies like Microsoft's (Nasdaq: MSFT) Windows 7 and Intel's Core processors, HP's Personal Systems Group (PSG) drove revenues of US$40.741B and operating profits (OP) of $2.032B. By comparison, the company's Enterprise Storage and Servers group delivered greater OP ($2.402B) than PSG with less than half its revenues ($18.651B). Printing and Imaging made $25.764B in revenues and $4.412B OP, and Enterprise Services' $34.935B in revenues resulted in OP of $5.609B. In fact, while PSG accounted for nearly a third of HP's $126B FY2010 revenues, it only contributed about 16 percent of the company's $14.4B in OP.
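The contrast is easier to see as operating margins. A quick back-of-the-envelope calculation from the FY2010 figures quoted above:

    # Operating margin (operating profit / revenue) for the HP FY2010 figures
    # quoted above, in billions of dollars.
    segments = {
        "Personal Systems Group":          (40.741, 2.032),
        "Enterprise Storage and Servers":  (18.651, 2.402),
        "Printing and Imaging":            (25.764, 4.412),
        "Enterprise Services":             (34.935, 5.609),
    }

    for name, (revenue, op) in segments.items():
        print(f"{name}: {100 * op / revenue:.1f}% operating margin")
    # PSG comes out around 5 percent, versus roughly 13 to 17 percent for the
    # other segments, which is the gap driving the "greener pastures" argument.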

The point here is not to slam HP -- the company continues to lead the PC market in overall share, and $2.032B in operating profits ain't spare change by any measure. Instead, it demonstrates the practical reasoning underlying HP's and IBM's decisions to leave PCs behind for greener, higher margin pastures.

Final Thoughts

However, just because a technology has become commoditized doesn't mean that it has lost its vital edge or potential for innovation. What really seems to be IBM's (and now HP's) focus on life after PCs is an emphasis on developing and delivering the IT infrastructure beneath consumer and business computing.

That makes perfect sense for IBM, from both business and technological points of view. Inventing the PC was something of an aberration for the company, which has focused on business-related technologies, tools and services since its inception. Even its signature PC product -- the ThinkPad -- found its greatest success among business users.

As such, both IBM's invention of the PC and the decision to leave it behind are all the more impressive, showing the company's willingness to move beyond what were once core markets and products. That demonstrates another considerable strength -- the ability to focus on the future rather than dwelling on the past. Both moves qualify as examples of an organization that knows itself and deeply understands its core strengths. Both deserve to be celebrated.

But IBM's conscious shift toward IT infrastructure also creates something of an irony related to Dean's comments about PCs "going the way ... of incandescent light bulbs."

Thomas Edison, who is best remembered for his improvements to the electric light, also founded the Edison Illuminating Company in 1880. That company was created to commercialize his inventions and capitalize the use of incandescent bulbs by developing and managing power production and distribution infrastructures.

In 1890, Edison brought together numerous business interests under the Edison General Electric (NYSE: GE) Company, which became the General Electric Company (GE) in 1892. In 1896, GE was one of the 12 companies listed on the original Dow Jones Industrial Average and, 115 years later, it is the only one of them still listed on the index.

GE remains one of the world's premier corporations, is a major player in numerous industries, including electrical infrastructure equipment, and to this day continues to bring incandescent light bulbs to market.

With all due respect to those forecasting the demise of personal computers, the bottom line to this observer is that PCs and the PC Era have quite a few birthdays ahead of them.
