American technology giant Google recently announced the launch of the second generation of its Pixel Buds in the United Kingdom, Canada, France, Australia, Germany, Ireland, Italy, Singapore and Spain on 13 July. The earbuds had previously been available exclusively in the United States.
One of the most heavily promoted features of these wireless earphones is their ability to translate dozens of languages in real time using Google Translate, but at the moment it is hard to consider this function as much more than a gimmick.
Different accents may make languages difficult to translate
Using voice recognition technology similar to that of the Google Assistant, the conversation mode can theoretically translate what someone says into another language in real time, allowing two people to chat regardless of their respective languages.
However, Google Translate, which produces the sentences in the other language, is still not fully reliable. The text version struggles with longer sentences, proper nouns (such as brands, names or locations) and recognising mistakes in the source material.
On top of these problems, which can lead to poor translations, the Pixel Buds version of the translation app is likely to have issues with accents and fast speech. This makes its use potentially cumbersome in faster-paced environments such as restaurants, one of the main use cases suggested by Google.
As such, it is fairly unlikely that this technology will create a connected world where everybody understands each other anytime soon.
Less common languages may be a problem
Although not as revolutionary as Google claims, these earbuds could still be useful in the field of tourism, as Google Translate fares well with short sentences. Provided they receive training and adapt to the technology’s shortcomings, professionals such as tourist information bureau workers could benefit from this innovation.
Some tourists, including Americans and Brits, are infamous for not speaking other languages while on holiday, and this tool could improve their ability to communicate when travelling abroad.
Similarly, these earphones could allow for easier exchanges in destinations whose languages are not commonly taught around the globe. However, because the feature is based on machine learning, it will only be as good as the data it has been trained on. It might therefore struggle to translate less common language pairs (e.g. Swedish–Thai).
In other words, users should always keep in mind that the technology is in its infancy and that it is still far from being fully functional.