Last week, Kathryn Zealand shared some insight on the eve of Women’s Equality Day. The post highlighted an issue that’s been apparent to everyone in and around the robotics industry: there’s a massive gender gap. It’s something we try to be mindful of, particularly when programming events like TC Sessions: Robotics. Zealand cites some pretty staggering figures in the piece.
According to the stats, around 9% of robotics engineers are female. That’s bad. That’s, like, bad even by the standards of STEM fields in general — which is to say, it’s really, really bad. (The ethnic disparities in the same source are worth drawing attention to, as well.)
Zealand’s piece was published on LinkedIn — fitting, given that the overarching focus here is on hiring. Well worth your time, if you’re involved in the hiring process at a robotics firm and are concerned about broader diversity issues (which hopefully go hand in hand for most orgs). Zealand offers some outside-the-box thinking in terms of what, precisely, it means to be a roboticist, writing:
We have a huge opportunity here! Women and other under-represented groups are untapped pools of talented people who, despite not thinking of themselves as “roboticists,” could be vital members of a world-changing robotics team.
I’m going to be real with you for a minute: what really caught my eye was the image above. See, Zealand is a Project Lead at Alphabet X. And what you have there is a robotic brace — or, rather, what appears to be a component of a soft exosuit.
Exosuits/exoskeletons are a booming robotics category right now, one that runs the gamut from Sarcos’ giant James Cameron-esque suit to far subtler, fabric-based systems. Some key names in the space include Ekso Bionics, ReWalk and SuitX. Heck, even Samsung has shown off a solution as part of a robotics department that appears to be largely ornamental at the moment.
Most of these systems aim to tackle one of two issues: 1) augmenting workers to assist with difficult or repetitive tasks on the job and 2) providing assistance to those with impaired mobility. Many companies offer solutions for both. Here’s what Harvard’s Biodesign Lab has to say on the matter:
As compared to a traditional exoskeleton, these systems have several advantages: the wearer’s joints are unconstrained by external rigid structures, and the worn part of the suit is extremely light. These properties minimize the suit’s unintentional interference with the body’s natural biomechanics and allow for more synergistic interaction with the wearer.
Alphabet loves to give the occasional behind-the-scenes peek at some of its X projects, and it turns out we’ve had a couple of glimpses of the Smarty Pants project. Zealand and Smarty Pants make a cameo in a Wired UK piece that ran early last year about the 10th anniversary of Google/Alphabet X. The piece notes that the project was inspired by her experience with her 92-year-old grandmother’s mobility issues.
The piece highlights a very early Raspberry Pi-controlled setup created by a team that includes costume designers and deep learning specialists (getting back to that earlier discussion about outside-the-box thinking when it comes to what constitutes a roboticist). The system uses sensors in an attempt to predict movement and anticipate where force needs to be applied for tasks like walking up stairs. The piece ends on a fittingly somber note: “Fewer than half of X’s investigations become Projects. By the time this story is published it will probably have been killed.”
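For a rough sense of what a prototype loop like that might look like, here’s a minimal, purely illustrative Python sketch. The sensor, thresholds, activity labels and force values are all hypothetical for the sake of the example, not details from X.

```python
import random
from collections import deque

WINDOW = 50  # hypothetical number of recent sensor samples kept for prediction

def read_hip_angle():
    """Stand-in for reading a hip-flexion angle (degrees) from a Pi-attached sensor."""
    return random.uniform(0, 60)

def predict_movement(samples):
    """Toy heuristic: a rising hip angle over the window suggests stair climbing."""
    if len(samples) < 2:
        return "unknown"
    trend = samples[-1] - samples[0]
    return "stair_ascent" if trend > 15 else "level_walk"

def assist_force(movement):
    """Map the predicted movement to a made-up assist force, in newtons."""
    return {"stair_ascent": 40.0, "level_walk": 10.0}.get(movement, 0.0)

window = deque(maxlen=WINDOW)
for _ in range(200):  # stand-in for the real-time control loop
    window.append(read_hip_angle())
    force = assist_force(predict_movement(list(window)))
    # on actual hardware, this is where actuator tension would be commanded
```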
My suspicion is that the team is looking to differentiate itself from other exosuit projects by leveraging Google’s knowledge base of deep learning and AI to build out those predictive algorithms.
Alphabet declined to offer additional information on the project, noting that it likes to give its moonshot teams “time to learn and iterate out of the spotlight.” But last October, we got what is probably our best look at Smarty Pants, in the form of a video highlighting Design Kitchen, Alphabet X’s lab/design studio.
The Wired piece mentions a “pearlescent bumbag” holding the aforementioned Raspberry Pi and additional components. For you Yanks, that’s a fanny pack, a term that isn’t used in the U.K., owing to certain regional slang. Said fanny pack also makes an appearance in the video, providing, honestly, a very clever solution to the issue of hanging wires for an early-stage wearable prototype.
“One of the things that’s really helped the team is being really focused on a problem. Even if you spent months on something, if it’s not actually going to achieve that goal, then sometimes you honor the work that’s been done and say, ‘we’ve learned a ton of things during the process, but this is not the one that’s actually going to solve that problem.’ ”
The most notable takeaway from the video is some additional footage of prototypes. One imagines that, by the time Alphabet feels confident sharing that sort of stuff with the world, the team has moved well beyond it. “It doesn’t matter how janky and cardboard-and-duct-tape it is, as long as it helps you learn — and everyone can prototype, even while working from home,” the X team writes in an associated blog post.
The one other bit of information we have at the moment is a patent application granted last year, which comes with all of the standard patent caveats. Seeing a patent come to fruition is often even more of a longshot (read: moonshot) than betting on an Alphabet X project to graduate. But patents can offer some insight into where a team is headed — or at least some of the avenues it has considered.
The patent describes attempts to anticipate movement similar to those highlighted above. It effectively uses sensors and machine learning to adjust the tension on regions of the garment designed to assist the wearer.
The proposed methods and systems provide adaptive support and assistance to users by performing intelligent dynamic adjustment of tension and stiffness in specific areas of fabric or by applying forces to non-stretch elements within a garment that is comfortable enough to be suitable for frequent, everyday usage. The methods include detecting movement of a particular part of a user’s body enclosed within the garment, determining an activity classification for that movement, identifying a support configuration for the garment tailored to the activity classification, and dynamically adjusting a tension and/or a stiffness of one or more controllable regions of the garment or applying force to non-stretch fabric elements in the garment to provide customized support and assistance for the user and the activity the user is performing.
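Boiled down, the claimed loop is: detect movement, classify the activity, look up a support configuration, then adjust tension or stiffness per region. Here’s a rough, hypothetical Python sketch of that flow; the activity labels, garment regions and tension values are all invented for illustration, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SupportConfig:
    """Hypothetical per-region tension settings, from 0.0 (slack) to 1.0 (taut)."""
    hip: float
    knee: float

# Made-up mapping from activity classification to a support configuration.
SUPPORT_TABLE = {
    "sit_to_stand": SupportConfig(hip=0.8, knee=0.9),
    "walking":      SupportConfig(hip=0.3, knee=0.2),
    "standing":     SupportConfig(hip=0.1, knee=0.1),
}

def classify_activity(joint_velocity: float) -> str:
    """Toy stand-in for the machine learning classifier the patent describes."""
    if joint_velocity > 1.0:
        return "sit_to_stand"
    return "walking" if joint_velocity > 0.2 else "standing"

def adjust_garment(joint_velocity: float) -> SupportConfig:
    """Detect movement -> classify activity -> look up configuration -> apply it."""
    config = SUPPORT_TABLE[classify_activity(joint_velocity)]
    # On a real garment, tension/stiffness would be applied per controllable region here.
    return config

print(adjust_garment(1.4))  # SupportConfig(hip=0.8, knee=0.9)
```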
It’s nice seeing Alphabet take a more organic approach to developing robotics startups in-house, rather than the acquisitions and consolidations of several years back that ultimately found Boston Dynamics briefly living under the Google umbrella. Of course, we also saw the recent graduation of the Wendy Tan White-led Intrinsic, which builds software for industrial robotics.
All right, so there’s a whole bunch of words about a project we know next to nothing about! Gotta love the startup space, where we’re definitely not spinning wild speculation based on a thin trail of breadcrumbs.
I will say for sure that I definitely know more about Agility Robotics than I did this time last week, after speaking with the Oregon-based company’s CEO and CTO. The conversation was ostensibly about a new video the team released showcasing Digit doing some menial tasks in a warehouse/fulfillment setting.
Some key things I learned:
Agility sold a dozen Cassie robots, largely to researchers.
It’s already sold “substantially more” Digits.
The team includes 56 people, primarily in Oregon (makes sense, as an OSU spinout), with plans to expand operations into Pittsburgh, everyone’s favorite rustbelt robotics hub.
Agility is consulting with “major logistics companies.”
In addition to the Ford delivery deal, the company has its sights set on warehouse tasks in hopes of offering a more adaptable solution than ground-up warehouse automation companies like Berkshire Grey.
Oh, and a good quote about job loss from CEO Damion Shelton:
The conversation around automation has shifted a bit. It’s viewed as an enabling technology to allow you to keep the workforce that you have. There are a lot of conversations around the risks of automation and job loss, but the job loss is actually occurring now, in advance of the automated solutions.
Agility hopes to start rolling out its robots to locations in the next year. More immediate than that, however, is this deal between Simbe Robotics and Midwestern grocery chain Schnucks. The food giant will be bringing Simbe’s inventory robots to all 111 of its stores, four years after it began piloting the tech.
Simbe says its Tally robot can reduce out-of-stock items by 20-30% and detect 14x more missing inventory than standard human scanning.
Carbon Robotics (not to be confused with the prosthetic company of the same name that made it onto our Hardware Battlefield a few years back) just raised $27 million. The Series B brings its total funding to around $36 million. The Seattle-based firm builds autonomous robots that zap weeds with lasers. We highlighted its most recent robot in this column back in April.
And seeing how we recently updated you on iRobot’s continued indefinite delay for the Terra, here’s a new robotic mower from Segway-Ninebot.
Segway’s first robotic lawnmower is designed for a lawn area of up to 3,000 square meters, has several features of a smart helper in the garden and is the quietest mower on the market with only 54 dB. The Frequent Soft Cut System (FSCS) ensures that the lawn is cut from above and the desired height is reached gradually. Offset blades allow cutting as close as possible to edges and corners.
That’s it for the week. Don’t forget to sign up to get the upcoming free newsletter version of Actuator delivered to your inbox.