While Google is focused on artificial intelligence and making Google Assistant smarter and more conversational, the company needs processing horsepower to make it happen. Enter: TPU 3.0.
Google CEO Sundar Pichai outlined TPU 3.0, the third version of the Tensor Processing Unit, Google's custom application-specific processor designed to accelerate machine learning and model training.
These TPUs handle TensorFlow workloads, which are used by researchers, developers, and businesses. TPU 3.0 will be primarily consumed by Google Cloud.
Pichai didn't offer up a ton of detail about the TPU 3.0 pod other than to say it is eight times more powerful than the TPU 2.0 pod announced a year ago.
"These chips are so powerful that for the first time we had to introduce liquid cooling in our data centers," said Pichai. A TPU 3.0 pod can handle up to 100 petaflops of machine learning compute.
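As a rough sanity check, the two figures Pichai gave line up with earlier public reporting. Assuming the widely reported ~11.5 petaflops figure for a TPU 2.0 pod (a number from prior coverage, not from this article), the arithmetic can be sketched as:

```python
# Rough arithmetic behind the "eight times" claim.
# The 11.5 petaflops TPU 2.0 pod figure is an assumption
# taken from earlier public reporting, not from this article.
tpu_v2_pod_petaflops = 11.5
claimed_speedup = 8  # "eight times more powerful"

tpu_v3_pod_estimate = tpu_v2_pod_petaflops * claimed_speedup
print(tpu_v3_pod_estimate)  # 92.0
```

That 92-petaflop estimate is in the same ballpark as the "up to 100 petaflops" Pichai quoted, so the two claims are roughly consistent.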
As for use cases, Google uses TPUs to develop models trained on large data sets.
Pichai also outlined Google Assistant improvements as well as some key research areas such as healthcare and predicting outcomes. One early example of Google AI revolved around predicting health risks from retina scans.