The Mechanical Silent Spring

Oliver Mitchell
Feb 12, 2017

Trump’s Chinese hoax had quite a week with temperatures plummeting over 40 degrees overnight followed by a foot of snow. I write this post buried in a blizzard, as the wind is whipping around my windows. The rattling of glass is a reminder of how often we humans ignore the reality of science. Thankfully, roboticists are hard at work on Plan B.

Last month, one of President Obama’s final executive orders added the rusty patched bumblebee to the endangered species list. Bumblebees pollinate close to one-third of our food supply, yet since the 1990s the rusty patched bumblebee’s population has declined by nearly 90% due to pesticides, climate change, habitat loss, and disease. Compounding the threat is Colony Collapse Disorder (CCD), a phenomenon in which worker bees abandon their hive, leaving behind the queen with ample food but no workforce to care for her young. Around the world, the decline of bee populations is alarming, threatening hundreds of billions of dollars’ worth of food.

“Pollinators are small but mighty parts of the natural mechanism that sustains us and our world,” said Tom Melius of the U.S. Fish and Wildlife Service.

At Japan’s National Institute of Advanced Industrial Science and Technology, Eijiro Miyako and his colleagues have showcased a drone bee capable of cross-pollinating flowers. The key is its size and the soft, flexible animal hairs on its belly, which extract pollen without damaging the stamens or pistils. The 4-centimeter micro-drone is covered with a sticky gel and horsehair, enabling pollen to adhere to it and rub off on the next flower.

“We hope this will help to counter the problem of bee declines,” says Miyako. “But importantly, bees and drones should be used together.”

Miyako’s team is now developing the next generation of drone bees: a hive of autonomous units that could help farmers pollinate their crops. GPS, high-resolution cameras, and artificial intelligence would be built into each drone’s firmware so it can independently plot a path between flowers.
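To make that idea concrete, here is a toy sketch of one way a drone could plot a path between flowers: a greedy nearest-neighbor route over known flower coordinates. The function name and the 2-D coordinates are illustrative assumptions, not anything from Miyako's actual firmware, which would have to fuse GPS and camera data in real time.

```python
import math

def plan_route(start, flowers):
    """Greedy nearest-neighbor route through a list of flower coordinates.

    A toy stand-in for onboard path planning: from the current position,
    always fly to the closest unvisited flower.
    """
    route, pos, remaining = [], start, list(flowers)
    while remaining:
        nxt = min(remaining, key=lambda f: math.dist(pos, f))
        remaining.remove(nxt)
        route.append(nxt)
        pos = nxt
    return route

flowers = [(3, 4), (1, 1), (0, 2)]
print(plan_route((0, 0), flowers))  # [(1, 1), (0, 2), (3, 4)]
```

Greedy routing is not optimal in general, but it is cheap enough to run on a small embedded controller, which is the constraint a bee-sized drone would face.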

Saul Cunningham at the Australian National University says that using drones to pollinate flowers is an intriguing idea but may not be economically feasible. “If you think about the almond industry, for example, you have orchards that stretch for kilometers and each individual tree can support 50,000 flowers,” he says. “So the scale on which you would have to operate your robotic pollinators is mind-boggling.”
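Cunningham's point about scale is easy to check with a back-of-envelope estimate. The planting density and per-drone throughput below are illustrative assumptions, not figures from the article; only the 50,000 flowers per tree comes from Cunningham.

```python
# Back-of-envelope estimate of the pollination workload Cunningham describes.
trees_per_hectare = 100        # assumed almond planting density
orchard_hectares = 1_000       # a large commercial orchard
flowers_per_tree = 50_000      # per Cunningham

total_flowers = trees_per_hectare * orchard_hectares * flowers_per_tree
print(f"{total_flowers:,} flowers")  # 5,000,000,000 flowers

visits_per_drone_per_day = 10_000  # assumed, and optimistic
drone_days = total_flowers / visits_per_drone_per_day
print(f"{drone_days:,.0f} drone-days to visit each flower once")  # 500,000
```

Even with generous assumptions, a single large orchard works out to hundreds of thousands of drone-days per bloom, which is the "mind-boggling" scale Cunningham is pointing at.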

As reported previously, Harvard University researchers led by professor Robert Wood have been working on RoboBees since 2012. Their super-lightweight bees could be deployed on farms within the next 5–10 years. Although Wood said that CCD and the threat it poses to agriculture were part of the original inspiration for creating a robotic bee, the devices aren’t meant to replace natural pollinators forever. He suggests that his RoboBees would serve as a “stopgap measure while a solution to CCD is implemented.”

The concept of using robots to solve man-made problems is not novel. At Chonnam National University in Korea, synthetic plants are being developed to counter deforestation. The robotic plant stands 4¼ feet tall and measures almost 16 inches in diameter. This mechanical photosynthetic eukaryote is designed to react to different stimuli, even bending its stem and blooming its buds at visitors. Beyond physically reacting to stimuli, the plant’s main purpose is to emit oxygen and moisture like its real-life counterparts.

Project leader Park John-oh suggests building a robot garden from his creation (sure, we’ll get right on that just as soon as we get our Doomsday Machine back online).

Rachel Carson’s image of a silent spring has started to come into focus, with robotic bees flying above mechanical plants. Another consequence of environmental damage, the hole in the ozone layer, is rising skin cancer rates. While robots have yet to patch the atmosphere, AI-enabled software is giving doctors better tools to diagnose cases earlier.

Medical researchers at Stanford University are using a neural network algorithm, originally developed by Google, to diagnose skin cancer with 95% accuracy. The team first trained the network to sort images and recognize patterns. Lead researcher Andre Esteva explains the first step: “We taught it with cats and dogs and tables and chairs and all sorts of normal everyday objects. We used a massive data set of well over a million images.”

Esteva’s team then fine-tuned the network on 129,450 dermatology images representing more than 2,000 skin diseases, gathered from 18 online galleries curated by the Stanford University Medical Center. In early tests, the neural network did as well as, and sometimes better than, board-certified dermatologists. For example, the algorithm correctly classified 96% of malignant growths and 90% of benign lesions, while its human counterparts identified 95% of malignancies and 76% of benign lesions.
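In standard terms, the figures quoted are the classifier’s sensitivity (the share of malignant lesions it flags) and specificity (the share of benign lesions it clears). A minimal sketch, using hypothetical counts chosen only to reproduce the article’s percentages:

```python
def sensitivity(tp, fn):
    """Fraction of malignant lesions correctly flagged (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of benign lesions correctly cleared (true negative rate)."""
    return tn / (tn + fp)

# Hypothetical counts matching the reported percentages.
print(sensitivity(tp=96, fn=4))   # 0.96 -> algorithm on malignant growths
print(specificity(tn=90, fp=10))  # 0.9  -> algorithm on benign lesions
```

The split matters clinically: a missed malignancy (low sensitivity) is far costlier than a false alarm (low specificity), which is why the dermatologists’ 76% benign-lesion score is the more forgivable of their two numbers.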

“The aim is absolutely not to replace doctors nor to replace diagnosis,” says Esteva. “What we are replicating [is] sort of the first two initial screenings that a dermatologist might perform.”

Long term, Esteva envisions empowering patients to self-diagnose lesions via their mobile devices. As he states, “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”

However, some scientists are skeptical. Computational biologist Evelina Gabasova of the University of Cambridge explains that Esteva is a long way from enabling non-medical users to diagnose their own cancer. While the neural network may be proficient at recognizing high-quality images curated by doctors, analyzing badly lit cell-phone snapshots is a far harder problem. In her words, “the caveat is that, at the moment, [the software] is trained on clinical images, which may have different lighting but still have similar quality.”

The bee, the plant, and the doc-in-a-pocket are novel ways to counter the effects of global warming. Humankind is becoming increasingly comfortable delegating its earthly responsibilities to machines, filling the void of responsible climate policy. We will be discussing this and more, including biomimicry robots like bees, at our next event about The Societal Impact of Robots on March 2nd in New York City.



Oliver Mitchell

Oliver Mitchell is a partner at ff Venture Capital. His area of focus is drones, robotics, and applied AI. Oliver is also an adjunct professor at YU.