THE FUTURE IS HERE NOW

SpeedDream is a story that I have been following for a while; I have posted about it before. It is the next step in monohull sailing. The America’s Cup has just finished a week of racing in San Francisco, and today three of the teams will unveil their respective 72-foot boats, which will take the America’s Cup to a whole new level.

Cloud computing, a phrase that is easily thrown around, has been with us for a long time already, but it is still in its infancy. Amazon is one of the big suppliers. All of this is changing the face of how business is conducted. I wonder what the future consequences will be when one does not really have control over one’s own material. It is truly in someone else’s hands. Or, someone else’s server.

At my age, I am used to the idea that in order to sell something you must have a product to exchange for money. Clearly I am old fashioned. I find it harder every day to justify my existence on earth. My ideas do not find a market as easily as they once did.

I suppose it is one of the reasons I like long-distance sailing. Life is reduced to very elemental levels. Perhaps it reminds us who or what we really are. I have very little desire to go into space; I find the world has much too much to offer. I would like to see the earth from space, however.

WILL THIS WORK AT SEA?

NEW TECHNOLOGY

A new sensor system from the Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute, can help to detect weak points in time and warn yachtsmen before the breaking point is reached. Prof. Wolfgang Schade and his team in the Project Group for Fiber Optical Sensor Systems in the German town of Goslar have developed “nerves of glass” which can measure the forces that act on hulls, masts, and sails.

Schade and his team’s next objective is to adapt the measurement technology so it is fit for use in competitive racing. “We have now fitted sail battens with fiber optic sensors, which will help competitors in future to find the optimal trim…” explains Schade. For the first time, the fiber optic sensors and the connected measuring equipment – which is no bigger than a cigarette packet and contains an LED light source, spectrometer, and electronics – are supplying reproducible values.

This data tells the crew in which areas there is too much or too little pressure, or how stresses shift to different areas, for example when the sheets are pulled in tighter. The results provided by the sensor technology will be accessible everywhere on board at all times – Schade’s team has already developed an app that allows crew members to access real-time data from their smartphones. The new measuring system will be launched shortly under the name NextSailSystem.
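The article does not spell out the sensing principle, but an LED source read by a spectrometer is the classic interrogation setup for fiber Bragg gratings, where strain stretches the grating and shifts the wavelength it reflects. Below is a minimal sketch of that readout, assuming FBG sensors; the photo-elastic coefficient is a typical textbook value for silica fiber, and the warning threshold is my placeholder, not a Fraunhofer figure.

```python
# Sketch of a fiber Bragg grating (FBG) strain readout, assuming the
# "nerves of glass" are FBGs. Constants and threshold are illustrative.

PHOTO_ELASTIC_COEFF = 0.22  # typical effective value for silica fiber

def strain_from_shift(base_wavelength_nm: float, measured_wavelength_nm: float) -> float:
    """Convert a Bragg wavelength shift into strain using the standard
    relation d_lambda / lambda = (1 - p_e) * strain."""
    shift = measured_wavelength_nm - base_wavelength_nm
    return shift / ((1.0 - PHOTO_ELASTIC_COEFF) * base_wavelength_nm)

def batten_status(base_nm: float, measured_nm: float,
                  warn_strain: float = 3e-3) -> str:
    """Flag a sail batten whose strain exceeds a (hypothetical) safe limit."""
    eps = strain_from_shift(base_nm, measured_nm)
    if abs(eps) >= warn_strain:
        return f"WARNING: strain {eps:.2e} exceeds limit {warn_strain:.0e}"
    return f"ok: strain {eps:.2e}"

# Example: a grating written at 1550 nm that now reflects at 1554 nm.
print(batten_status(1550.0, 1554.0))
```

A shift of a few nanometres on a 1550 nm grating corresponds to a few thousand microstrain, so even small wavelength changes carry useful trim information.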

DATA WE DREAMED ABOUT

For me this is a dream come true: the data instantly available, unobstructed by the crew. This is cool.

People who constantly reach into a pocket to check a smartphone for bits of information will soon have another option: a pair of Google-made glasses that will be able to stream information to the wearer’s eyeballs in real time.

According to several Google employees familiar with the project who asked not to be named, the glasses will go on sale to the public by the end of the year. These people said they are expected “to cost around the price of current smartphones,” or $250 to $600.

The people familiar with the Google glasses said they would be Android-based, and will include a small screen that will sit a few inches from someone’s eye. They will also have a 3G or 4G data connection and a number of sensors including motion and GPS.

A Google spokesman declined to comment on the project.

Seth Weintraub, a blogger for 9 to 5 Google, who first wrote about the glasses project in December, and then discovered more information about them this month, also said the glasses would be Android-based and cited a source that described their look as that of a pair of Oakley Thumps.

They will also have a unique navigation system. “The navigation system currently used is a head tilting to scroll and click,” Mr. Weintraub wrote this month. “We are told it is very quick to learn and once the user is adept at navigation, it becomes second nature and almost indistinguishable to outside users.”
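The piece gives no implementation details, but a tilt-to-scroll scheme like the one Mr. Weintraub describes could, in principle, be driven by the motion sensors mentioned above. Here is a minimal sketch, assuming raw 3-axis accelerometer readings; the axis convention, threshold, and event names are my own placeholders, not anything Google has confirmed.

```python
import math

SCROLL_THRESHOLD_DEG = 15.0  # hypothetical dead zone; the real value is unknown

def pitch_degrees(ax: float, ay: float, az: float) -> float:
    """Estimate forward/back head tilt from gravity as seen by a 3-axis
    accelerometer. We assume tilting the head forward produces a positive
    x reading; real devices differ in their axis conventions."""
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))

def navigation_event(ax: float, ay: float, az: float):
    """Map a head tilt to a scroll event; small tilts do nothing."""
    pitch = pitch_degrees(ax, ay, az)
    if pitch > SCROLL_THRESHOLD_DEG:
        return "scroll_down"
    if pitch < -SCROLL_THRESHOLD_DEG:
        return "scroll_up"
    return None

# Looking 30 degrees down: gravity gains a component along the tilt axis.
print(navigation_event(9.81 * math.sin(math.radians(30)), 0.0,
                       9.81 * math.cos(math.radians(30))))
```

A dead zone like the one above is what would make the gesture "almost indistinguishable to outside users": ordinary small head movements stay below the threshold and trigger nothing.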

The glasses will have a low-resolution built-in camera that will be able to monitor the world in real time and overlay information about locations, surrounding buildings and friends who might be nearby, according to the Google employees. The glasses are not designed to be worn constantly — although Google expects some of the nerdiest users will wear them a lot — but will be more like smartphones, used when needed.

Internally, the Google X team has been actively discussing the privacy implications of the glasses and the company wants to ensure that people know if they are being recorded by someone wearing a pair of glasses with a built-in camera.

The project is currently being built in the Google X offices, a secretive laboratory near Google’s main campus that is charged with working on robots, space elevators and dozens of other futuristic projects.

One of the key people involved with the glasses is Steve Lee, a Google engineer and creator of the Google mapping software, Latitude. As a result of Mr. Lee’s involvement, location information will be paramount in the first version released to the public, several people who have seen the glasses said. The other key leader on the glasses project is Sergey Brin, Google’s co-founder, who is currently spending most of his time in the Google X labs.

One Google employee said the glasses would tap into a number of Google software products that are currently available and in use today, but will display the information in an augmented reality view, rather than as a Web browser page like those that people see on smartphones.

The glasses will send data to the cloud and then use things like Google Latitude to share location, Google Goggles to search images and figure out what is being looked at, and Google Maps to show other things nearby, the Google employee said. “You will be able to check in to locations with your friends through the glasses,” they added.
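Taken together, that description amounts to a simple pipeline: a camera frame and a GPS fix go up to the cloud, several services annotate them, and the result comes back as an overlay. The sketch below is illustrative glue only; none of these function names are real Google APIs, they merely stand in for the services named above.

```python
# Hypothetical stand-ins for the cloud services the article names
# (Latitude, Goggles, Maps). No real Google API is called here.

def build_overlay(frame_jpeg: bytes, gps_fix: tuple, cloud) -> dict:
    """Assemble one augmented-reality overlay from a camera frame and a
    GPS fix, following the data flow described in the article."""
    friends = cloud.latitude_share(gps_fix)       # share location with friends
    in_view = cloud.goggles_identify(frame_jpeg)  # figure out what is being looked at
    nearby = cloud.maps_nearby(gps_fix)           # places around the wearer
    return {"friends": friends, "in_view": in_view, "nearby": nearby}

class FakeCloud:
    """Stub so the sketch runs without any network or real services."""
    def latitude_share(self, fix):   return ["alice (0.3 km away)"]
    def goggles_identify(self, jpg): return ["Golden Gate Bridge"]
    def maps_nearby(self, fix):      return ["Crissy Field", "Fort Point"]

print(build_overlay(b"...", (37.81, -122.48), FakeCloud()))
```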

Everyone I spoke with who was familiar with the project repeatedly said that Google was not thinking about potential business models with the new glasses. Instead, they said, Google sees the project as an experiment that anyone will be able to join. If consumers take to the glasses when they are released later this year, then Google will explore possible revenue streams.

As I noted in a Disruptions column last year, Apple engineers are also exploring wearable computing, but the company is taking a different route, focusing on computers that strap around someone’s wrist.

Last week The San Jose Mercury News discovered plans by Google to build a $120 million electronics testing facility that will be involved in testing “precision optical technology.”