Discovering innovations that matter since 2002

Smartphone attachments can push smartphones beyond their intended capabilities and transform them into multi-functional devices. For example, a smartphone attachment from South Korea lets users assess the condition of their skin and see any damage. Another example is a smartphone attachment from the Netherlands that can test water quality to identify if it is contaminated. And now, a team of engineers, programmers and designers from the US have designed a tapeless tape measure in the form of a smartphone attachment. Using augmented reality and a small laser emitter, the Arrim ONE can collect measurements without the need for tape.

Both imperial and metric units are available for taking measurements, and the device can convert between the two. Arrim ONE can measure straight lines, angles, circles and curves. It also acts as a levelling tool and a dividing tool, and can perform continuous measurement. Compatible with both iPhone and Android (8.0 or later), the device can be controlled directly through a smartphone. Within a 20 metre range, it offers a precision of +/- 1.5 millimetres. Augmented reality, a patented algorithm, phase-shift laser measurement and a virtual 3D coordinate system are some of the technologies behind the device’s accuracy. In addition, the device uses a 100 milliampere-hour lithium-ion battery, enough for 1,000 measurements on a single charge.
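As a rough illustration of the metric/imperial conversion described above, here is a generic sketch. The helper names are hypothetical and not part of Arrim ONE’s actual software.

```python
# Illustrative metric/imperial conversion of the kind the device performs.
# These helper names are our own invention, not any real Arrim ONE API.

MM_PER_INCH = 25.4  # exact by international definition


def mm_to_inches(mm):
    """Convert millimetres to inches."""
    return mm / MM_PER_INCH


def inches_to_mm(inches):
    """Convert inches to millimetres."""
    return inches * MM_PER_INCH
```

For example, `inches_to_mm(12)` gives 304.8 mm, well within the resolution needed to display the device’s stated +/- 1.5 millimetre precision.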

Currently crowdfunding on Kickstarter, Arrim ONE has already exceeded its pledge goal. Having developed both the hardware and software, Arrim ONE’s creators plan to ship the product in September 2018. It is on sale for USD 79, with early bird deals available at a lower cost. Smartphones offer a powerful CPU, a display screen and an internet connection, making them a good foundation for innovators to build attachable devices on. What other tools can be transformed into smartphone attachments?

As blood circulates through the veins, it passes through a series of one-way valves. If these valves, or the veins, enlarge or swell, it reduces the blood flow. This in turn can lead to a wide variety of conditions, such as varicose veins, lymphedema and deep vein thrombosis. When not properly treated, some types of circulation problems can lead to amputation, pulmonary embolism, stroke and even death. To prevent this, doctors recommend the use of compression devices for people who have circulation problems. One common complaint, however, is that these are uncomfortable and difficult to put on and take off. As a result, many people who need them stop wearing them. Now, Israeli start-up ElastiMed has developed a smart compression device that can help improve circulation in the legs.

ElastiMed’s device is designed to be worn throughout the day. Worn over the leg, it rhythmically expands and contracts, mimicking the natural contractions of the calf muscles and helping blood flow from the legs to the heart. ElastiMed’s technology uses electro-active polymers (EAPs), materials that expand and contract when stimulated by an electric field. The device is also IoT compatible, containing integrated sensors for data collection and compliance monitoring.

Chronic venous diseases affect an estimated 1 in 1,000 people in the Western world, and the market for compression devices to treat them is worth about 2.2 billion annually. ElastiMed is betting that there is a big market for treatment solutions that are both comfortable and technologically advanced. The company is currently undergoing clinical trials and applying for regulatory approval in Europe and the United States, and hopes to have the devices on sale by 2019. Medical devices are increasingly incorporating smart technology. We have seen this recently with a smart device that detects hidden bacteria and interactive AR imaging for doctors. Will the ElastiMed compression device help reduce the incidence of CVD-related complications?

For smartphone users, apps have streamlined some of life’s most stressful tasks. From helping the visually impaired to assisting with building websites, apps have been revolutionary. Advances in technology have also meant that smartphone users have become used to turning to their phones to solve almost any kind of problem, and phone capabilities keep expanding as new innovations are unveiled almost every day.

Travel is another field where apps have proven particularly useful, streamlining the booking process and making it more transparent. Umetrip, a Chinese flight booking and check-in app, has put its own twist on the technology by extending its offering to incorporate a social element. A new update allows users to access information about their fellow fliers: data such as names, hometowns and star signs are all visible in the app. Umetrip users can then message fellow passengers directly, or add them to a group chat without their permission. This has triggered privacy concerns.

TravelSky Technology, the state-owned aviation company that owns Umetrip, said in a statement that the new feature meets the demands of its users. Passengers can edit their personal pages to show less personal information, or turn the option off entirely. Critics argue that the problem is that the data sharing and messaging capability are on by default. Some argue the virtual cabin approach could encourage socialising on long journeys, especially amongst solo travellers. Others say they would not want their personal details accessed by strangers, especially when their permission hasn’t been sought. In response, the company has announced that a new update will deactivate the function by default. With many companies now complying with GDPR regulations, how would you like your data to be handled?

One problem with synthetic insecticides is that they kill not only pests, but also beneficial insects such as bees, beetles and butterflies. This, in turn, can have an adverse effect on the biodiversity of entire ecosystems. When water runs off fields and into lakes and streams, it carries the insecticide with it, endangering the wildlife living in those areas as well. We have previously seen a number of innovations aiming to make farming more sustainable, including an app that helps farmers identify plant diseases and a robot that scares away pests. Now, a team of researchers from the Technical University of Munich (TUM) has developed a biodegradable pesticide that spreads a smell which keeps unwanted insects away.

Professor Thomas Brück and his team in the Department of Industrial Biotechnology were inspired by the tobacco plant, which produces a natural pesticide, called cembratrienol, to protect itself. The researchers first isolated the parts of the tobacco genome responsible for production of the natural repellent and inserted these into the genome of E. coli bacteria. The genetically modified bacteria are grown in large vats, and the pesticide is then separated out and used as a spray.

Studies by the researchers have shown that the cembratrienol spray wards off aphids while remaining non-toxic to insects. It also does not accumulate in the environment. As an added bonus, the spray appears to have an anti-bacterial effect, meaning it could also be used as a non-toxic disinfectant. The researchers believe cembratrienol could be effective against pathogens such as MRSA, as well as those that cause pneumonia and listeriosis. According to Brück, the spray could be the key to a “fundamental change in crop protection,” by focusing on repelling pests instead of killing them. Will natural pesticides prove as effective at pest control as synthetic ones?

Docademia’s mission is to provide a platform for authentic storytelling by under-represented voices. Founded by Nassim Abdi, the Chicago-based startup licenses documentary films, provides them for university professors to incorporate into their curricula, and compensates filmmakers for the use of their work. Each year the startup hosts an international short documentary film contest, and it receives thousands of submissions for general consideration throughout the year. Key to the startup’s success in reaching storytellers is the compensation offered via a two-year licensing deal, with film-makers approving where, how and by whom their work is used.

Abdi explains, “People in this generation understand something better when it’s visual… and that disconnect was something that I really started thinking about”. To grab students’ attention, the company provides professors with access to a library of stories that have never been told. It is also developing a subscription package available for a monthly fee: professors access materials at no additional cost, while students and members of the public pay for membership. As well as the films themselves, the curriculum often includes a Q&A with the film-maker activist, and educators may also arrange workshops for faculty. Docademia wants to get films into all the social science areas of study. The company’s founders are passionate about critical engagement and discourse, and believe in the power of film.

Film effects change in other ways as well. Around 22 people in the US die every day due to a lack of donated organs. To encourage people to donate, one campaign used a feature film to ask how far someone would be willing to go to save a life. Viewers visit the website, enter a code, and hold their mobile phones to their hearts. Once the phone’s accelerometer detects a heartbeat, it brings the film’s actress to life in digital movie posters and on a Times Square billboard. For viewers with limited sight and hearing, an American startup created an app that uses technology to amplify sound and provide screen descriptions. How else could the arts encourage positive behaviour change and constructive analysis?

Here at Springwise we have recently covered a number of innovations involving sustainable flight, such as an autonomous drone taxi and airplane biofuel made from old clothes. Now, Swiss company SolarStratos is poised to make aviation history with a solar-powered aircraft. The SolarStratos aims to be the first solar-powered plane to reach the stratosphere, 25,000 metres up. The project has been underway for four years and is expected to attempt a manned flight to 10,000 metres this fall, followed by a flight to the stratosphere later in the year. That flight will take around two and a half hours to reach 25,000 metres. There, on the edge of space, pilot and SolarStratos founder Raphaël Domjan will spend 15 minutes in the stratosphere before slowly spiralling back down to Earth.

The SolarStratos plane has been developed using a combination of off-the-shelf and custom parts. Austrian battery firm Kreisel Electric has developed an experimental 20 kilowatt-hour lithium-ion battery that can operate safely in the extremely low temperatures of the stratosphere, while California-based SunPower is providing the solar cells that will power a small electric engine and charge the battery. Additionally, the extreme cold and lack of oxygen in the stratosphere mean that the pilot will need to wear a pressurised spacesuit. Russian spaceflight specialist Zvezda has developed a specially adapted, lightweight suit, which it is donating to the project.

Domjan’s goal is to not just push limits, but to draw attention to the need to tackle climate change. He hopes that taking a solar-powered plane to the edge of space will send the message that clean technology has a vast potential. Once the technology has been proven, SolarStratos plans to build a three-person version. This will include a pressurised cabin and will operate commercial space tourism flights. The company hopes to have these running by 2021, with prices possibly starting at around 60,000 USD. Will SolarStratos help to convince people of the potential of solar power?

Artificial Intelligence (AI) has been touted as a way to improve almost every aspect of life. There are already AI systems that can compose songs, monitor construction projects and detect heart attacks. What exactly is AI, and will it turn the world into a Terminator-style dystopia, or create an Eden where computers do all the work?

Put simply, AI is a term used to describe computer software that can learn from experience. This is very different from most software, which can only perform tasks it has been pre-programmed to execute. One way to see how AI is different is to think about the way humans recognise a dog. There are hundreds of different breeds of dog in the world, of all different colours, sizes and shapes. Yet humans, even very young children, can instantly recognise any of them as a dog. To teach a computer to do this it is necessary to first show it millions of photos of dogs. It must then be taught how to compare any new image it receives to the photos, and draw a conclusion as to whether the new image is also a dog. Achieving this requires two things – a lot of very large data sets and a lot of processing power to analyse the data. It is only recently that these two things have become readily available.
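The compare-to-examples idea in this paragraph can be sketched as a toy nearest-neighbour classifier. The three-number “feature vectors” below are invented stand-ins for real image data; an actual system would extract features from millions of photos.

```python
import math

# Labelled examples: (made-up feature vector, label). A real system
# would hold features extracted from millions of photos, not four rows.
EXAMPLES = [
    ((0.9, 0.8, 0.7), "dog"),
    ((0.8, 0.9, 0.6), "dog"),
    ((0.2, 0.1, 0.3), "not dog"),
    ((0.1, 0.3, 0.2), "not dog"),
]


def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def classify(features):
    """Label a new item with the label of its closest known example."""
    _, label = min(EXAMPLES, key=lambda ex: distance(ex[0], features))
    return label
```

A new image whose features sit close to the “dog” examples is labelled a dog; everything the software “knows” comes from the labelled data, which is why both large data sets and the processing power to search them matter.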

One of the most powerful types of AI software is the deep neural network, used in deep learning. In deep learning, many different algorithms (small software programmes) are connected together in layers. Each algorithm looks at a different part of the problem and weights the calculations made by the earlier layers. For example, in the problem of recognising a dog, the first layer might focus just on the outline, another on colour, and yet another on size. Every layer weights the input from the previous layers and then passes all the information on to the next layer. The result (dog or no dog) is based on calculations from the collective weights of all the layers. In this way, the software teaches itself how to recognise a dog.
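A minimal, hand-written sketch of this layered weighting may help. The features (outline, colour, size) and all the weights below are invented for illustration; a real network has far more layers and learns its weights from millions of labelled photos rather than having them hand-picked.

```python
import math


def layer(inputs, weights, bias):
    """One layer: weight each input, sum, and squash the total into 0..1."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))  # logistic squashing function


def looks_like_a_dog(outline, colour, size):
    """Combine three feature scores (each 0..1) through two layers.

    All weights here are hand-picked for illustration, not learned.
    """
    h1 = layer([outline, colour, size], [2.0, 0.5, 1.0], -1.0)
    h2 = layer([outline, colour, size], [1.5, 1.0, 0.5], -0.5)
    # The final layer weights the two intermediate results into one verdict.
    out = layer([h1, h2], [2.5, 2.5], -2.5)
    return out > 0.5
```

With strong feature scores such as `looks_like_a_dog(0.9, 0.8, 0.7)` the verdict is positive; with weak ones such as `(0.1, 0.2, 0.1)` it is negative. In real deep learning, training nudges the weights until the verdicts match the labelled examples.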

For now, most AI can only perform a clearly defined task. For example, an AI algorithm that has learned how to play poker cannot also learn to play chess. The next step is to develop an AI system that can solve new problems or teach itself new tasks by applying information it already knows. This type of AI is called artificial general intelligence (AGI). Some progress towards AGI has already been made: researchers at DeepMind have created an algorithm that has learned how to play both Go and chess. The final step would be an AI that is actually conscious – that is, self-aware. Will AI one day be able to do everything humans can do?

Four years ago we covered the OrCam headset, a device that helps the visually impaired. The OrCam mounts onto any pair of glasses and can detect things in the wearer’s field of view. The wearer points at objects, text, or people they want to identify, and the OrCam provides the information in audio form using bone conduction. Now, the company has introduced the OrCam MyEye 2.0 – a vast improvement on the original, ground-breaking device.

The original device was the size of a smartphone and used a separate head unit and base unit connected with a wire. The new version is compact and wireless: the new MyEye has been reduced to about a tenth of the size and weight of the original and is entirely self-contained. It also incorporates improved AI-powered computer vision and machine learning, enabling it to read computer screens and printed text, such as books, product labels and menus. It can also identify currency and translate text into different languages. The device responds to intuitive gestures, so users simply need to point to the piece of text they want read, and hold out their hand to stop the audio. Text-to-speech is available in 15 languages.

MyEye can also now scan barcodes to recognise products, and comes with a preloaded database containing hundreds of thousands of barcodes. Other functions include colour identification and telling the time when users lift their wrists as though looking at a watch. It also incorporates a facial recognition algorithm that can be trained to recognise people standing in front of the wearer. The MyEye 2.0 sells for around USD 3,500, roughly the price of a mid-range hearing aid. The glasses-mounted wearable competes in a market increasingly dominated by cheap and free apps, such as Microsoft’s Seeing AI, which also translates the visual world into audible experiences. According to users, MyEye 2.0 is ahead of the game thanks to the speed of its new AI engine, which does all the work and takes less than 30 seconds to programme in a face.

It joins a range of innovative devices aimed at helping those with disabilities, including a system that translates between sign language and English and a navigation app to help those with blindness. With software and hardware moving at a very rapid pace, will OrCam be able to create a sustainable and long-lasting solution?

Plastic waste from retail packaging is a big problem. Discarded containers are finding their way into islands of ocean plastic and wreaking havoc on marine wildlife and beyond. We’re seeing attempts to clean up our oceans and to recycle the plastic into functional, durable products. Lush, however, is tackling the problem at the source by targeting one of the main contributors to plastic waste: cosmetic packaging.

The UK-based cosmetics retailer has developed an image recognition app as part of its ‘naked’ initiative to reduce plastic use. Called Lush Lens, the app is a key element of Lush’s new brick-and-mortar outlet store in Milan, which will feature entirely packaging-free products. Lush’s R&D team has produced a range of solid shampoos and other cosmetics, including a sea turtle bath bomb that releases agar ‘plastic’ into the bathwater to raise awareness of how plastic waste affects sea turtle habitats.

These cosmetics were specifically designed to remove the need for packaging. However, when packaging is removed, a crucial source of product information is lost. This is where the app, also developed by the R&D team, comes in. The store has partnered with Fairphone to provide in-store smart devices loaded with the app. Customers can point the device’s camera at the product they’re interested in, and the software will recognise the product and display an AR (augmented reality) information page about its ingredients. Lush Lens is powered by TensorFlow, Google’s open source machine learning framework. Following the first store in Milan, Lush plans to launch Lush Lens globally, along with a range of other ‘naked’ initiatives.

Integrated Roadways introduces pre-cast, connected pavement slabs that turn any road into a real information highway. The Kansas City-based technology startup created Smart Pavement for use in future smart roads. The company’s vision is a hard-working highway system that manages a variety of tasks while vehicles are in motion: the roads charge electric vehicles, connect passengers to high-speed internet and alert emergency services immediately after an incident. Vehicles that slide off the road no longer have to wait for a passerby to phone in the accident – sensors in the pavement record tyre speed and location, immediately identifying when a vehicle leaves the road.

Pre-cast slabs greatly reduce construction and installation costs and integrate technology from the start. Repairs and upgrades are much quicker and less expensive thanks to the pavement’s modularity. A router on each side of a slab connects it to its neighbours, making current analogue roads smart. The roads use these connections to relay traffic news and route suggestions to drivers and autonomous vehicles. Integrated Roadways is now beginning work on a pilot project with the Colorado Department of Transportation; the test will comprise half a mile of smart road and run for five years.

In Sweden, the world’s first electrified road charges vehicles as they drive. A rail connected to the power grid is embedded in the road, transferring energy to the vehicle moving above it via a movable arm attached underneath the vehicle, which detects the rail and connects to it. It’s not just vehicular traffic that is getting a sustainable upgrade. A bike path in Poland is charged by the sun and glows blue at night to illuminate the way for riders. Needing 30 to 60 minutes of daylight to fully charge, the phosphorescent path glows for up to eight hours, and the team behind the design expects it to work for up to 20 years. What would help cities combine smart pedestrian sidewalks with electrified, smart roads and integrated cycle ways?