Updated with several more emails.

I have no clue about computer chip design and manufacturing. So I trolled VentureBeat’s readers with a challenge: Explain to me how Apple’s switch from third-party chip manufacturers to its own in-house design for the A4 chip that powers the new iPad tablet computer makes the iPad better, either for Apple or for iPad customers.

Former Apple employee Prabhakar Kotla sent me not one snappy answer, but ten of them.

“Instead of looking at the A4 in a linear way, you must think in a multi-dimensional way.

* A4 consumes less power than most comparable chips at the same processing speeds (you can look up a video comparing the ARM Cortex and Intel).
http://www.youtube.com/watch?v=W4W6lVQl3QA&feature=player_embedded (This should explain.)
* NVIDIA is working on its own chip; I think it's called Tegra. [Actually, VentureBeat has covered the second-generation Tegra 2 chips.]
* When it comes to performance, it's not just the chip; even graphics matter. (Apple hired a bunch of ATI folks to work on them.) [See note below]
* Papermaster, who replaced Tony Fadell (iPod SVP), is known for small-device computing.
* More control for Apple over its hardware and products.
* Lower power consumption means longer battery life.
* Apple is not in the chip market; it's in the consumer products market, so comparing at the chip level makes no sense. It's like comparing ice cream cones: you won't find major differences between the cones, but a cone with rich ice cream does matter. (Rich ice cream here refers to hardware design, applications, usability, robustness, the cool and wow factor, and much more.)

Finally to answer your actual question:

* A4 gives the iPad longer battery life without sacrificing CPU power.
* A4 plus the right graphics chip boosts the iPad's performance over other chips.”
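The battery-life point above is simple arithmetic: runtime is roughly battery energy divided by average power draw. Here's a minimal sketch, using purely illustrative numbers; neither figure is a confirmed iPad spec:

```python
# Illustrative only: battery life scales inversely with average power draw.
battery_wh = 25.0      # assumed battery capacity in watt-hours (placeholder)
avg_draw_watts = 2.5   # assumed average system power draw (placeholder)

hours = battery_wh / avg_draw_watts
print(f"{hours:.0f} hours of runtime at {avg_draw_watts} W average draw")
```

Halve the average draw and you double the runtime, which is why a lower-power application processor matters so much in a tablet.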

On the hiring issue, another reader says:

“They hired Michael Frank, who was one of the primary chip architects @ ATI working on the Radeon.  After ATI Michael went to Pixim where he was the chip architect.  Then to AMD where he was a Fellow working on future CPU architectures.  He’s not just working on graphics, but rather overall architecture.”

Other techies preferred to remain anonymous, but were happy to explain:

“With the A4, Apple has taken an ARM CPU core and married it with exactly the IP they need.  They needed a memory controller, a display controller, a 3D graphics engine, a WiFi core, etc. etc. etc.  There are no (well, not exactly, but for all practical purposes) unnecessary blocks in the A4 chip.

“In contrast, the Intel, Moto, and Nvidia options are something more generic.  They might have blocks in them that Apple doesn’t need.  These blocks add cost and power without buying any benefit.

“On the business side of things, Apple is not beholden to anyone else’s silicon schedule.  The team they have building their chips is very experienced and seasoned.  When they craft a schedule that says they will tape out in 18 months, they almost always hit it within weeks of the original date.  Couple this with Samsung as a fab partner willing to run as many experimental wafers as Apple needs, and you have the ability to roll your own chip with what you need, when you need it.

“Another benefit is cost.  Yes, it costs to have a chip team, but consider that on a gross per-chip basis Apple is paying about 1/2 to 1/3 less than for a comparable chip (typical markup).  Amortize this across 10 million chips (pulling a number out of my ass for the iPhone 3GS run rate) and you end up with a big chunk of $ saved.  Maybe not enough to fully fund the chip team, but over time it might.  Add in the power/schedule freedom it affords and you have a very compelling reason.”
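The reader's amortization argument is easy to sanity-check with back-of-envelope math. The per-chip price and savings fraction below are hypothetical placeholders, not actual Apple figures:

```python
# Back-of-envelope check of the reader's cost argument.
# All numbers are illustrative assumptions, not real pricing data.
merchant_chip_price = 15.00   # assumed per-chip price from a third-party vendor, USD
savings_fraction = 1 / 3      # reader's low-end estimate of the markup avoided
units = 10_000_000            # the reader's guessed run rate

savings_per_chip = merchant_chip_price * savings_fraction
total_savings = savings_per_chip * units
print(f"~${total_savings / 1e6:.0f}M saved across {units:,} chips")
```

Even at these modest assumed numbers, the savings land in the tens of millions of dollars, which is the reader's point: the volume is what makes an in-house chip team plausible to fund.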

And of course, there’s always the guy who answers a question with a question. Here, GPU means the graphics processing unit, a chip such as those made by Nvidia that offloads complicated onscreen graphics from the central processing unit, or CPU, such as the Pentiums and Core 2 Duos that Intel is known for. TDP stands for thermal design power, industry jargon for the average maximum power, measured in watts like a light bulb, that a chip can dissipate while running commercially available software.

“I have heard unconfirmed reports that it is a modified A9 Cortex processor with two CPU Cores and some type of GPU core.

“Was wondering if you can confirm that it really is two CPU cores (versus one to save on TDP), and to ask what kind of GPU Apple is using.

“If it is a dual-core modified A9, then TDP per CPU core is about 0.25 watts. Add in the GPU, etc., and we are looking at about 0.8 watts total TDP for the application processor.”
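That reader's arithmetic can be reconstructed like this; the GPU-and-other allowance is my inference from the quoted totals (2 × 0.25 W + ~0.3 W ≈ 0.8 W), not a confirmed spec:

```python
# Reconstructing the reader's TDP estimate. All figures are the
# reader's unconfirmed guesses, not published A4 specifications.
tdp_per_cpu_core = 0.25   # watts, estimated per CPU core
cpu_cores = 2             # the rumored dual-core configuration
gpu_and_other = 0.30      # watts, inferred allowance for GPU and other blocks

total_tdp = tdp_per_cpu_core * cpu_cores + gpu_and_other
print(f"Estimated application-processor TDP: {total_tdp:.1f} W")
```

Dropping to a single core under this model would shave only about 0.25 W off the total, which is why the reader frames the core count as a TDP trade-off.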

Are we sure of that? Another reader says the A4’s graphics core is based on the same GPU architecture as the iPhone’s.

“Apple A4’s GPU core is almost certainly based on the PowerVR SGX 5 series from Imagination, possibly based on the new SGX 545 or some 540 series model. There is discussion on whether Apple is using this or a completely new GPU like the ARM Mali, but given that the A4 must run iPhone apps (especially games) in compatibility mode, it doesn’t make sense to deviate from the iPhone class PowerVR SGX hardware that most iPhone games target and risk a different performance profile from a new GPU architecture.”

Finally, for another point of view entirely, one says performance isn’t the issue.

“The biggest point is that if Apple has its own chip, they control their destiny. Also, they make it difficult for anyone to put iPhone OS on another device. But the Tegra is much better than this. You can see it in the new Tegra devices coming out.”

I have to hand it to Apple’s own engineers. They sure know how to shut up about their work. But that leaves me looking to readers again for answers. Is Apple’s A4 really a modified ARM Cortex-A9 with two CPU cores and some type of GPU core? Email paul@venturebeat.com if you know more than I do.

Update: A former Apple engineer writes, “A4 was stated to contain an ARM Cortex-A8, not an A9. It was an internal source that told me. I trust them. No confirmation on which SGX, but it is one.”