ZerO-TecH

Powered by Blogger.

Saturday, August 3, 2013

Rockstar’s Agent is still alive, GTA V PC petition nears quarter million

PS3 exclusive title Agent has been in development limbo for quite some time over at Rockstar. The game was announced back in June 2009 during Sony’s E3 presser.
It seems like we revisit this title every year with no new information about the game. Take-Two did update investors back in 2011 that the game was indeed still in development. The game was brought up during Take-Two’s Q1 2013 earnings call, during which CEO Strauss Zelnick stated that the publisher was not ready to announce anything about the title at the time.
Fast forward to this year, head of Sony Worldwide Studios, Shuhei Yoshida was asked if Agent would be coming to the PS4 instead. Yoshida stated, “You are asking the wrong person. I have some knowledge, but I’m not in a position to talk about it.”
Rockstar’s Sam Houser previously described the game as being set during the Cold War and "the world of counter-intelligence, espionage and political assassinations.”
Just this week, Take-Two renewed its trademark for the game Agent. You can also find the official Agent webpage up on Rockstar’s site currently.
In other news, Grand Theft Auto V is starting to gain momentum with PC gamers, who are petitioning Rockstar to bring the game to their platform.
While Rockstar has refused to clearly state whether the game will come to the PC, recent job listings did call for an experienced engineer to port code over to the PC. Almost a quarter of a million members have signed the petition so far.
Rockstar previously stated that a PC or Wii U version is still up in the air. Stay tuned for more details.
Posted in Games, Industry, PC, PlayStation 3, Xbox 360 | 16 Comments » Read more from Mike Ferro

View the original article here
The Raspberry Pi mini computer starts at $25

Imagine a computer slightly bigger than an Altoids box that costs only $25. This tiny computer has been developed to make it easy for kids to learn how to program using Linux. The computer hooks up to a TV and a keyboard, plays Blu-ray discs, and runs Fedora, Debian, and Arch Linux. Hopefully kids will play around with programming like they used to in the ’80s and ’90s.
The Raspberry Pi Foundation is a UK charity.  The actual idea for the Raspberry Pi mini computer started in 2006 with Eben Upton, a lecturer at Cambridge University.  One of his roles also included admissions.  He noticed that the experience of applicants interested in Computer Science had changed drastically.  Rather than kids who were playing around with programming on the family computer, the current applicants had little to no experience with computer programming.
He and fellow colleagues from the university like Rob Mullins and Alan Mycroft started pondering how to get kids programming again.  Upton started building prototypes of Raspberry Pi.  In 2008, processor chips designed for mobile devices became powerful enough and cheap enough to provide good multimedia support.
The current models are the size of a credit card. An illustration of the computer can be found below. The minicomputers will do just about everything a regular computer will do, including word processing, spreadsheets, gaming, and playing videos. Both models have the same components but different amounts of RAM: the A model has 128 MB and the B model has 256 MB. To put this in perspective, the latest cell phone models have one gigabyte of RAM, so we aren’t talking about computers that will do a lot of multitasking.
According to EcoGeek, the performance of the computer will be similar to a 300 MHz Pentium processor. The actual processor in use is an ARM-based system-on-a-chip. These should be easy to power since they will run off of four AA batteries. The Model A will require only a 300 mA charger and the Model B a 700 mA charger. Solar power is also an option.
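Those charger ratings give a rough feel for how long four AA cells might last. The sketch below is a back-of-the-envelope estimate only: the AA capacity figure is an assumption (typical NiMH cells are around 2000 mAh), and real draw varies with load, so treat the results as ballpark numbers, not measurements.

```python
# Rough battery-life estimate for the Raspberry Pi models described above.
# The AA capacity is an assumption; the current draws are the charger
# ratings from the article, taken as a worst-case continuous draw.

AA_CAPACITY_MAH = 2000                       # assumed NiMH AA capacity
DRAW_MA = {"Model A": 300, "Model B": 700}   # charger ratings from the article

def runtime_hours(draw_ma, capacity_mah=AA_CAPACITY_MAH):
    """Four AA cells in series share one capacity; runtime = capacity / draw."""
    return capacity_mah / draw_ma

for model, draw in DRAW_MA.items():
    print(f"{model}: ~{runtime_hours(draw):.1f} h on four AA cells")
```

On those assumptions, the Model A would run roughly twice as long as the Model B on the same cells, which matches the 300 mA versus 700 mA ratings.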
As mentioned earlier, the idea behind these computers is to provide a small affordable computer for kids to learn programming. The foundation has gotten a lot of interest from the educational community and developing countries.  Inquiries have also come from museums and hospitals who want to use the Raspberry Pi to run displays. Of course, people interested in building robots are interested in these little boards. While the computers are aimed at kids, they sound like they might be a lot of fun to play with for adults as well.
The first production run will be 10,000 units and will go on sale at the end of this month. Orders are limited to one per person. The Raspberry Pi will come uncased and can be ordered from raspberrypi.com. The Model A will cost $25 and the Model B, $35.
The Foundation does expect to offer a buy-one, donate-one option at some point in the future, and, as with all charity organizations, they accept donations.
Raspberry Pi model
Tags: computers, Fedora, Linux, Raspberry Pi Posted on: February 24th, 2012 by Susan Wilson
Steven Cherry: Hi, this is Steven Cherry for IEEE Spectrum’s “Techwise Conversations.”
One day, about 25 years ago, an engineer at Intel was trying to install a multimedia card into his computer and came up with the idea for a universal port for peripherals. The engineer was Ajay Bhatt, and the port became USB.
Today, there are by one count 6 billion USB devices in the world, and not just printers and keyboards and mice. There’s a USB butt cooler for your chair, USB heated gloves for your hands, and a USB disco ball for your inner John Travolta. Ajay Bhatt wasn’t kidding when he envisioned a universal serial bus port.
Ajay Bhatt hasn’t stood still, and neither has USB. There’s a USB 2.0 and 3.0, and Bhatt has worked on the Accelerated Graphics Port and PCI Express. He’s worked on Intel’s desktop power-management architecture and is now helping to develop a computer that will work all day. He’s my guest today.
Ajay, welcome to the podcast.
Ajay Bhatt: It’s nice to be with you, Steven.
Steven Cherry: You ended up with collaborators from six other companies: Compaq, DEC, IBM, Microsoft, NEC, and Nortel. How did you end up with so many cats to herd?
Ajay Bhatt: Well, the thing is, even though we started with, you know, sort of a silicon company, we started with a certain view of the computer. We wanted to be inclusive. We wanted to make this port universal. We wanted to comprehend requirements from system vendors and peripheral vendors and software developers. So by gathering a group of diverse folks, it made USB even more successful, because we got, you know, input that we wouldn’t have from Intel’s side.
Steven Cherry: It really is universal, and I was sort of joking about the butt warmer, although it does exist. But there’s different pin settings for different applications—printers, smart cards, audio/video. There’s one for health care. Does that get used?
Ajay Bhatt: Yes. Actually, there are some companies that make glucometers, and there’s some equipment where USB is used, particularly to upload and download information. Just the way you upload and download information from a smartphone, for example.
Steven Cherry: I love the convenience of USB, but I have to say it drives me crazy that it has an up and a down. You’d think that I’d get it right at least half of the time, but I don’t think I do.
Ajay Bhatt: That’s an interesting point. If I were to do this all over again, that’s one thing I would fix. When we started, though, I’d like to remind you that some of the ports were such that the degree of freedom was about 360 degrees, and blind mating a cable to a port was very difficult. So when we started, by taking the connector we had, we made a big jump, an advancement. However, the limitations we had were with respect to the cost. So we found the cheapest connector that we could afford at that point, and that’s what we’ve ended up living with. But as we go forward, I think we may get an opportunity to fix that problem as well, at some point in the future.
Steven Cherry: I’m glad you mentioned cost, because I wanted to ask you: FireWire, IEEE 1394, already existed in 1995, and it was much faster. Did USB win entirely on price, or were there other things as well?
Ajay Bhatt: Well, a few other things as well, but when we started, we reached out to everybody in the computer industry with the view that we had. We even made an attempt at approaching people to revise their specs to meet our requirements. So there was a real effort made in earnest to sort of bring people together. But one of the big things that we also focused on, besides cost, was an architecture that was open, that was widely available to developers, royalty-free, and at no cost. So we developed this technology and made it available.
Now, such terms were not available for all the other interfaces that were out there. They had a different business model. Our view was to really promote an open standard that would enable new users of the computers, and attract new users, because computers would become much easier to use.
Steven Cherry: Apple turned out to be the leading user of FireWire, and it seems like they keep doing this again and again. I’m wondering what you think of their Lightning, which I guess is similar to USB 2.0, with some proprietary extensions.
Ajay Bhatt: I haven’t looked at Lightning in much more detail, but I think the problem that they’re trying to solve for all Apple ecosystems is to make—I guess one of the most advanced features of that connector is that it’s flippable, right? So the problem that you were talking about with USB, which only goes one way, they’ve sort of solved the problem. I think the other problem that they’ve fixed is the size. The connector that they’ve chosen for Lightning is a very, very small size. And when you have very portable devices, size does matter.
Steven Cherry: What about Thunderbolt, which I guess is also sort of an Apple port technology—and I guess it’s also a superset of one of yours, that is to say, PCI Express.
Ajay Bhatt: Yeah, so I’m on one of the—two of the primary patents on Thunderbolt. I actually was the original guy who worked on Thunderbolt for a while before I moved over to a different job assignment at Intel. But Thunderbolt basically is addressing a need of supporting two protocols: display port or very high-resolution displays. And PCI Express, in certain of these cases, you want to desegregate IO and take it outside the box. And those are the two needs that Thunderbolt meets. So it is targeted toward a specific segment of the market, and I think a specific set of applications.
Steven Cherry: Is it the next big thing, or will there be a USB 4.0?
Ajay Bhatt: Well, I think both these, Thunderbolt and USB, will keep evolving. I know USB IF, or Implementers Forum, has announced their intentions for extensions, so I expect USB to be extended. And so will Thunderbolt be extended as well.
Steven Cherry: And the main point of extending USB would be the large devices? Especially being able to charge them, right? Like monitors?
Ajay Bhatt: Well, a couple things: One is, there’s a recent extension to USB—it’s called USB Power Delivery, or USB PD. And USB PD is an extension that allows a power source and a computing device to negotiate power delivery mechanisms, or the voltages and the current. So that’s the spec that enhances the way we charge devices, or the devices give charge to external peripherals.
So USB PD is one of the extensions. The other things that are happening are related to the data rate. As you can see, as we go to devices that store more and more data—for example, I have an SLR camera. When I shoot RAW, each picture is about 25 to 30 megabytes. Now, to transfer those pictures from the camera to the computer takes a long time using USB 2.0. With the newer extensions, that transfer would take a few seconds rather than a few minutes.
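Bhatt’s "minutes versus seconds" point can be sanity-checked with a quick calculation. The rates below are nominal signaling rates for USB 2.0 and 3.0; real-world throughput is lower because of protocol overhead and encoding, and the card size (100 photos at 30 MB each) is an illustrative assumption, so these are best-case figures only.

```python
# Back-of-the-envelope transfer times for the RAW-photo example above.
# Nominal signaling rates, in megabits per second; actual throughput
# is lower, so treat these as best-case numbers.

NOMINAL_MBPS = {"USB 2.0": 480, "USB 3.0": 5000}

def transfer_seconds(megabytes, rate_mbps):
    """Time to move `megabytes` of data at a nominal rate in Mbit/s."""
    return megabytes * 8 / rate_mbps

card_mb = 100 * 30  # assumed: 100 RAW photos at ~30 MB each
for bus, rate in NOMINAL_MBPS.items():
    print(f"{bus}: {transfer_seconds(card_mb, rate):.0f} s for {card_mb} MB")
```

Even as an idealized figure, the order-of-magnitude gap between the two buses is what makes the camera-offload case compelling.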
Steven Cherry: Maybe you could just tell us about the all-day computer: Why has Intel put its best engineer on the case there?
Ajay Bhatt: Well, one of the major pain points that we all have using portable computing, and for me it is tablets and laptops, is power efficiency. You know, even though in the past people have claimed six-hour battery life, in normal use you’re lucky if you get half that, right? So that was clearly the pain point for people: They want to use the computer all day long; they are sort of tethered; they have to bring the power brick with them. So we leveraged the advances made in silicon technology, as well as the design techniques involved in the architecture, so that we can actually fix this problem.
So in the last four years or so, I’ve spent a lot of time looking at all aspects of PC architecture and sort of audited each part of the architecture and systematically gone in and tried to fix the issues in the system that would result in draining the power unnecessarily. So we’ve really done a big overhaul in PC architecture to enable PCs to now run all day on a single battery.
Steven Cherry: In some ways it’s really just a weight problem, right, not a power problem? I mean, if we were just willing to carry computers, you know, that were state-of-the-art 10 years ago, we would have all-day computers, wouldn’t we?
Ajay Bhatt: Well, that’s not the right way of solving the problem. You know, we didn’t want computers to be luggable. We still want computers to be very sleek, lightweight, attractive, yet have a good dynamic range when it comes to performance. So adjust the power consumption of the computer based on the tasks that you’re doing. Of course there are certain tasks that are very, very demanding, and we want to provide that level of processing power to support the most demanding application. At the same time, when you’re doing some simple browsing or word-processing kind of tasks, then we want it to be much more power efficient. So we want to provide power efficiency while maintaining reasonable size and weight and temperature of the device.
Steven Cherry: Up to now, producing light, sleek, attractive computers was a sort of a differentiator for a manufacturer, and I find it interesting that Intel is now trying to solve this problem for the industry as a whole. And you’ve said that the point of USB is that the companies should not be competing at the level of infrastructure. But I guess I’m wondering, how do you know what’s infrastructure and what should be a manufacturer’s value-added?
Ajay Bhatt: That’s a very good question. Usually when the buses interconnect, where two sides of the interconnect are used by two or more parties, that becomes an infrastructure issue, right? Because when you want to communicate with other devices, you’d better agree with the rest of the engineering community on the specs. But then once you define the rules of communication, how efficiently you communicate, or if you can create a much more efficient implementation or architecture, then that becomes the differentiation.
It’s very similar to building a highway by agreeing on the size and the lane width and what have you, but then building a car or set of cars that run on that highway is akin to building a differentiated computer device, right? So may that be a wireless or wired connection, there’s certain aspects of architecture where you need to be open with respect to architecture and specification, such that anybody can build interoperable devices, because ultimately these devices will be used by users, and you don’t want them to be frustrated. Because if each company does proprietary design, then these devices don’t interoperate. So for the sake of interoperability, you must invest in common infrastructure.
Steven Cherry: So what’s the interoperability issue when it comes to the all-day computer?
Ajay Bhatt: Well, with all-day computers, remember, in the computer we have components from a variety of vendors, right? So if we all agree to a common set of power policies in this case, then we know when the devices need to wake up and when they go to sleep; otherwise a misbehaved device could actually keep the rest of the computer on and drain the battery. So even when it comes to all-day computing, each device, along with the software, has to be able to communicate with the others using messages that convey the state of the machine and its various subsystems.
I think that’s where some of the work we’re doing is invaluable, because you have to agree to entering and exiting certain power states at the right time without actually being visible to the end user. So if the end user clicks on some application, appropriate behavior should be demonstrated. However, underneath, the subsystems that are not being used have to be powered down for however long they’re not needed. And then they should wake up transparently to the user, so there’s a tremendous amount of work that needs to be done inside the computer to make it much more efficient.
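Bhatt’s point about one misbehaved device draining the battery can be sketched as a toy model: the platform can only drop into a deep sleep state when every subsystem agrees it is idle. The class and state names below are illustrative only, not any real power-management spec.

```python
# Toy illustration: deep sleep is gated on unanimous idleness, so a single
# device that never reports "idle" keeps the whole platform awake.
# Names and states are hypothetical, not a real power-management API.

class Subsystem:
    def __init__(self, name, idle=True):
        self.name = name
        self.idle = idle  # True when the device has declared itself idle

def platform_can_sleep(subsystems):
    """The deepest power state requires every subsystem to be idle."""
    return all(dev.idle for dev in subsystems)

devices = [Subsystem("display"), Subsystem("wifi"), Subsystem("audio")]
print(platform_can_sleep(devices))   # all idle -> True

devices.append(Subsystem("rogue-usb-gadget", idle=False))
print(platform_can_sleep(devices))   # one busy device -> False
```

The `all(...)` gate is the crux: power savings compound only when every vendor’s component honors the shared sleep protocol, which is why Bhatt frames it as an interoperability problem.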
Steven Cherry: I wanted to ask you a question about Intel in general, and I thought I’d compare it to some other organizations. Bell Labs used to be a place where people could do university-style research without having to teach, and today Microsoft Research is like that. But it seems like a lot less makes its way out of the lab and into products—the Kinect, maybe, being the notable recent exception—than does at Intel. Does Intel have some secret sauce when it comes to generating immediately usable R&D?
Ajay Bhatt: Well, so, we have a rather large investment in Intel Labs. Now, Intel Labs does various things. One, we have close relationships with academia. So we have Intel Science and Technology Centers at universities around the world, including various cities in the U.S. So Intel researchers, along with university researchers, collaborate on certain things.
We also, at Intel, have researchers who focus on fundamental research, may that be in process technologies, architectures, or design technologies. And they take a long-term view of technologies that will be needed in the future. And then we also have a view where we bring together people from different disciplines to come together and develop something called Rapid Prototypes. And these prototypes, or these technologies, are based on fundamental research that we may have done, and then based on that, we create new technologies that can be deployed in our future products. So we work with universities, do fundamental research as well as applied research, I would say, and a lot of these ideas make it into the product.
Steven Cherry: Well, Ajay, Intel did a somewhat silly TV commercial that featured you as a technology rock star, and Conan O’Brien did a version of that that managed to be even sillier, although it was hilarious. And you were in it, and it seemed like you really enjoyed it. I wonder, though, if you find it more humbling than anything else to know that you’ve improved the computing efficiency of basically half the planet every day.
Ajay Bhatt: Well, you know, I feel extremely privileged to get this opportunity to leave my fingerprints on the computing industry, right? Clearly the vision that we had was, you know, we were at the right place at the right time, and we were able to assemble the right group of people to make USB happen.
And with respect to Intel’s commercial, well they were looking to highlight thousands of engineers that we have, and they just chose me as an example because people could relate to USB, and hence they asked that my name be included in that commercial. And that’s how I ended up on Conan, and clearly Conan really—it was sort of out of my comfort zone to be on Conan. But I went along with it, and it was a lot of fun.
Steven Cherry: Yeah, it does look like a lot of fun, but there’s nothing silly, to be sure, about the achievement of USB, and PCI Express, so thanks for those things, and thank you for joining us today.
Ajay Bhatt: Thank you so much for having me.
Steven Cherry: We’ve been speaking with Intel’s Ajay Bhatt about the past, present, and future of USB.
For IEEE Spectrum’s “Techwise Conversations,” I’m Steven Cherry.
Photo: European Inventor Award
This interview was recorded Tuesday, 25 June 2013.
Segment producer: Barbara Finkelstein; audio engineer: Francesco Ferorelli

Read more “Techwise Conversations,” find us in iTunes, or follow us on Twitter.
NOTE: Transcripts are created for the convenience of our readers and listeners and may not perfectly match their associated interviews and narratives. The authoritative record of IEEE Spectrum’s audio programming is the audio version.

Friday, August 2, 2013

Apple IGZO displays could improve MacBook, iOS battery life

Apple is rumored to refresh the MacBook Pro later this year, and a new report says that the company may get access to some exclusive Intel Haswell processors to power the 2013 MacBook Pro versions.
A previous report indicated that the new MacBook Pro models could be launched at some point in October, with a variety of indirect evidence also pointing to the imminent product refresh, including recent price drops for 2012 MacBook Pro models as well as certain estimates from analysts.
Meanwhile, SemiAccurate has published a new report according to which Intel is going to provide Apple some unique “ultra-high performance” Haswell chips to be used in the new MacBook Pro models.
Apparently the new machines will get special on-board graphics, with Intel’s GT3e (Iris Pro 5200) said to be included in the new chips.
The company has reportedly asked for “a special top bin cream-of-the-crop GT3e selection from Intel, with ‘as much GPU power as possible,’” according to MacRumors, and Intel is said to provide such exclusive parts to the company.
This wouldn’t be the first time Apple and Intel team up for exclusive products, or when Apple is said to be the first to have access to new Intel chips.
Needless to say, the report can’t be confirmed just yet, as the 2013 MacBook Pro refresh is yet to become official. However, if the report is accurate, then it may be safe to assume that other Intel clients will not get access to these special Haswell chips just yet.
MacRumors reveals that next-gen MacBook Pro models may have already shown up in benchmark tests in June and July for the upcoming 13-inch and 15-inch versions of the laptop, respectively. According to those results, the new MacBook Pro models offer performance similar to current models, with the added benefit of improved power efficiency and therefore better battery life – just like the 2013 MacBook Airs.
Interestingly, the benchmark test for the 15-inch MacBook Pro seems to indicate there is no dedicated graphics card on board. It is believed that the new 15-inch MacBook Pro may rely solely on Intel’s integrated graphics power and not include a standalone graphics card, as previous models do.
For example, the 15-inch 2012 MacBook Pro models pack an Intel HD Graphics 4000 integrated graphics card, but also a NVIDIA GeForce 650M with 512MB or 1GB of GDDR5 memory depending on models. The 13-inch models only have an Intel HD Graphics 4000 graphics card. The same thing applies to 2012 Retina MacBook Pro models.
The new Iris Pro 5200 integrated graphics are apparently ready to offer extra performance, with Intel saying in its promotional materials that the Iris Pro 5200 paired with a Core i7-4950HQ chip (the same chip the 15-inch 2013 MacBook Pro model used in the benchmark test mentioned above) will offer 2-2.5 times the performance of an Intel i7-3840QM and Intel HD Graphics 4000 combo.
It is not yet clear whether the Retina MacBook Pro models will be refreshed alongside the regular MacBook Pro models, but it’s worth pointing out that Best Buy offered $200 off the 13-inch and 15-inch Retina MacBook Pro models during its Hot July Black Friday sale, a discount which could suggest the retailer may be interested in dumping existing stock before a product refresh.

Is Thorium the answer to our energy needs?

To hear proponents talk about Thorium reactors, you would think that Thorium is the energy panacea for which we have been searching. This readily found element can be used to build nuclear reactors that are walk-away safe, with waste that has a much shorter half-life and should be easier to dispose of. Current nuclear reactors need multiple redundant systems and can blow up, as we’ve seen with Fukushima. Thorium reactors won’t blow up and don’t need the multiple redundant systems. If they are so great, why are we still using Uranium reactors?
According to The Thorium Dream by Motherboard TV, it is because two major nuclear powerhouses want it that way. The other reason mentioned was that the current reactors, using 60-year-old technology, are what we are comfortable with and what we know works. Major disasters like Chernobyl, Three Mile Island and, most recently, Fukushima have shown that familiarity doesn’t make them safe, and the results are devastating when they fail.
Enter Thorium as the miracle that will save us as fossil fuel supplies dry up and current Uranium reactors are viewed as too dangerous. Rather than using solid fuel rods like light water reactors (LWRs) do, Thorium reactors use a liquid Thorium salt mixture. They don’t require redundant safety mechanisms, in part because they can’t blow up. And unlike Uranium, you can’t make bombs out of Thorium.
Richard Martin talked about Thorium on The Leonard Lopate Show. According to Martin, the amount of Thorium needed to produce electricity is significantly less than that needed in a Uranium reactor. A liquid Thorium-Fluoride salt reactor is actually a breeder reactor: it creates more fuel as it produces electricity. These types of reactors would require less maintenance and could run longer on the same fuel, producing less nuclear waste. Should something happen to the reactor, it would not blow up. At the bottom of the reactor is a salt plug that would melt, draining the radioactive fuel into a lead-lined safety chamber. In other words, we are talking about a type of nuclear reactor that is much safer than Uranium reactors, with less waste and less maintenance.
A Thorium reactor was brought online in the 1960s but was shut down after 6 years, primarily because market forces decided to continue focusing on Uranium reactors. Watch The Thorium Dream to get a better picture of why. While the United States may have taken a pass on these safer types of reactors, other countries like India and China are funding Thorium research and will probably have Thorium reactors before we will.
Unlike fossil fuels, Thorium doesn’t produce any carbon byproducts which makes it cleaner even than natural gas.  It is readily available so one country or area of the world, think OPEC, can’t manipulate the cost.  It would not require such risky methods as fracking or trying to extract oil from shale using pollution producing methods.
All in all, it looks as if Thorium reactors would actually help solve a number of our energy problems. While renewable energy continues to grow, it is growing so slowly that we still use coal-fired plants for much of our electricity. Thorium nuclear reactors would produce cleaner electricity. We would have cheaper electricity and could power our lives (including cars) using only a golf-ball-sized piece of Thorium.
The Thorium Dream will become reality.  Too bad it won’t happen here first.
More information on Thorium can be found here and here.
Thorium-Fluoride reactor
Tags: nuclear energy, Thorium, Thorium-Fluoride Reactors Posted on: February 26th, 2012 by Susan Wilson
The President’s Council of Advisors on Science and Technology (PCAST) met on July 18, 2013 in a joint session with the Council for Science and Technology (CST) from the United Kingdom. This was the first time the two similar bodies met together. The first agenda item was Big Data: Smart Cities. The Computing Community Consortium (CCC) has been involved in Big Data for quite some time, having convened a Big Data Study Group in 2008. The conversation began with Steven Koonin discussing The Center for Urban Science and Progress (CUSP) at New York University (NYU). CUSP is a public-private research center that uses New York City as both its laboratory and classroom. CUSP is leading the emerging field of “Urban Informatics.” Koonin spoke about the rationale for this new field and provided suggestions for a national program:
  • The encouragement of data sharing across government functions and with the private sector;
  • Data standards need to be defined;
  • Privacy research and regulation must be furthered;
  • Funding;
  • Cross-disciplinary training in undergraduate and graduate programs must occur;
  • Partnerships must be formed;
  • Urban Informatics research needs to have a “home.”
Next up, Sir Alan Wilson spoke about the Future of Cities Project and Science of Cities and Regions in the United Kingdom. For the Future of Cities Project, Sir Alan will be looking at the system of United Kingdom cities and some demonstrator cities to answer some Big Questions, such as: What makes a successful city? His work aims to build theories to help all cities in the future.
After the presentations, the members of PCAST and CST asked several questions of the presenters.  The webcast of the meeting can be viewed here.

AT&T adds LTE for pre-paid customers

If you want a fast data connection without a contract, add AT&T to your list of options. The company is adding LTE to its pre-paid contracts, which are currently available in a limited area only.
AT&T launched a pre-paid service under the Aio Wireless brand name last month. It’s a contract-free deal where you pay between $35 and $70 for unlimited voice, text and 3G data, plus a fixed amount of 4G data (the amount varying with the price).
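The tiered pricing above lends itself to a simple "cheapest plan that covers my data need" check. The article only gives the $35–$70 price range, so the per-tier 4G allotments in the sketch below are hypothetical placeholders, not Aio’s actual tiers.

```python
# Sketch of picking the cheapest plan whose 4G allotment covers a
# monthly data need. The (price, data) tiers are HYPOTHETICAL examples;
# only the $35-$70 price range comes from the article.

PLANS = [          # (monthly price in USD, included 4G data in GB)
    (35, 0.25),
    (55, 2.0),
    (70, 7.0),
]

def cheapest_plan(needed_gb):
    """Return the lowest-priced plan covering needed_gb, or None."""
    candidates = [plan for plan in PLANS if plan[1] >= needed_gb]
    return min(candidates, default=None)  # tuples compare by price first

print(cheapest_plan(1.5))   # a mid-tier need
print(cheapest_plan(10))    # more than any tier offers -> None
```

Since the tuples sort by price first, `min` naturally picks the cheapest qualifying tier; `default=None` handles the case where no tier is large enough.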
The service arguably launched a little earlier than would be ideal as it only offered HSPA+ rather than the more widely available LTE. That’s being fixed with an over-the-air update for compatible handsets.
The LTE support will also come ready-installed on new handsets, including the forthcoming ZTE Overture, a so-so looking Android phone.
Adding LTE support will be most important with the iPhone 5, the only real high-end handset available on the Aio Wireless deal. Analysts had noted getting LTE was pretty much essential if AT&T was going to compete with T-Mobile in the market for pre-paid iPhones.
Aio allows users to choose between paying the full cost of the handset up front ($649.99 for the iPhone 5), paying in installments by leasing it from a third-party firm, or using an unlocked handset brought from another network.
The big difference between Aio and the T-Mobile pre-paid deal is that the customer won’t have any form of credit agreement with AT&T itself and thus won’t have to undergo a credit check if they pay for the phone outright or bring their own handset.
Controversially, with T-Mobile the installments plan is linked to the phone service. The customer can stop taking the service at any point, but will then have to pay the rest of the handset cost immediately. With the Aio deal, it appears the customer can go elsewhere for service and continue paying the leasing firm in installments.
Posted in 4G, Android, AT&T | No Comments » Read more from John Lister