Beam should have a hardware API

We've got a few Beam telepresence robots at USV, and use them all the time. Fred has written about them here. We had a team meeting today, and we had two beams going at once -- Fred and I were the first to arrive, and we were chatting beam-to-beam -- he in LA, me in Boston, both of us in NYC by robot:

Talking beam-to-beam w @fredwilson - him in LA, me in Boston, talking in the USV office in NYC

— Nick Grossman (@nickgrossman) January 13, 2016

It works amazingly well. It has now become somewhat normal for robots to be roving around the office, having conversations w people, USV team folks and visitors alike.

One idea that keeps coming up is an extensible peripherals API -- the Beam robots already come w a USB port (used for initial setup), and it should be possible to use that to extend the robot with hardware. We joke about jousting (and have done some), but I could seriously imagine bolting on devices such as additional displays / LCDs, sensors of various kinds, devices that can perform human-like gestures (the way the Kubi can nod, shake and bow), etc.

Thinking of Beam as a platform in this way would certainly extend its capabilities (in particular for industry), and would also put Beam at the center of an ecosystem -- a much stronger position. Would love to see that happen.
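Purely as a thought experiment -- none of this is a real Beam SDK, and every name below is made up -- a peripherals API could be as simple as a plugin interface that the robot's software routes call events through. Here's a minimal sketch:

```python
from abc import ABC, abstractmethod

class BeamPeripheral(ABC):
    """Hypothetical base class a Beam hardware API might expose
    for USB-attached peripherals. Speculative sketch, not a real SDK."""

    @abstractmethod
    def name(self) -> str:
        """Identifier the robot uses to address this peripheral."""

    @abstractmethod
    def handle_event(self, event: str) -> str:
        """React to an event from the pilot's session; return the action taken."""

class GestureArm(BeamPeripheral):
    """Imaginary bolt-on device that performs human-like gestures,
    the way the Kubi can nod, shake and bow."""

    def name(self) -> str:
        return "gesture-arm"

    def handle_event(self, event: str) -> str:
        gestures = {"agree": "nod", "disagree": "shake", "greet": "bow"}
        return gestures.get(event, "idle")

# The pilot's client could dispatch session events to registered peripherals:
arm = GestureArm()
print(arm.handle_event("greet"))  # bow
```

The point isn't the specifics -- it's that a small, stable interface like this is all third parties would need to start building an ecosystem of attachments.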
