GPU maker Imagination Technologies will lose all of Apple’s payments within two years

At the beginning of the month, Imagination Technologies, the largest developer of licensable GPU and graphics solutions used in the iPhone and iPad, reported that Apple will completely abandon its licensed technology in less than two years. Cupertino has informed the company that it is developing “its own graphics architecture, which will in future reduce its dependence on Imagination’s technologies.”

According to analysts at UBS, the British company will at first receive from Apple only a third of its current license fees, and two years later the payments will stop completely. Without the cooperation of the Cupertino giant, Imagination Technologies will become unprofitable by the 2019 financial year, so the European chip designer will have to consider various cost-cutting options, the experts say.

Apple has notified Imagination Technologies that it will stop using the British company’s technology in new products within 15–24 months. As a result, Imagination Technologies’ market value fell by nearly a third in a single day.

Imagination Technologies is currently negotiating a new licensing deal with Apple. According to rumors, however, the Cupertino company is developing its own graphics processor, which makes further cooperation with Imagination, or with any other GPU vendor on the market, less likely.

Most likely, within a generation (in 2018) we can expect an iPhone built on an Apple AX chip in which the company has designed not only the CPU but also the GPU. Given Apple’s considerable experience and success in developing its own mobile processors (in recent years Apple’s SoCs have led the market in performance), its plans to develop a GPU as well are of particular interest.

But every aspect of the new graphics architecture will come under scrutiny, because Imagination is highly sceptical that Apple can develop its own GPU architecture from scratch without violating Imagination’s patents.
