What does the world think of Cozmo?

With Cozmo, it was love at first sight. So, when we were given the assignment to review a toy, I could not think of going with anything but Cozmo <3.

Cozmo is a very popular toy that had massive publicity when it launched in 2016. It was covered by most tech publications (Link 1, Link 2, Link 3) and people (me included!) have been going ga-ga over it. But does the toy hold up two years later? Let's find out!

First stop, the Cozmo website!

Cozmo is marketed as a toy with brains and personality. The home page (Link) talks about Cozmo as an accomplice that fits into your home, and the first page has multiple videos of what Cozmo can do. It's interesting to observe that the videos focus on Cozmo interacting and doing things instead of its design and looks. Clearly, the company is confident in Cozmo's personality as the driver for sales.


The second tab on the homepage is 'Life with Cozmo', which is an interesting choice. The tech tab is fourth in the navigation menu, which is unusual. Most companies love to show off the tech first, but the makers of Cozmo stick with what it can do. *Applause*

One of the most interesting features, animal detection, is hidden inside the tech section. They could have brought it up front to convince families that Cozmo would fit into a family perfectly.


So: brains, personality, smarts and fun engagement for the whole family (animals included!). Does Cozmo hold up to its promises? Let's find out.

My first stop is Amazon, where it has a great rating of 4.4! Link


Reading through the positive and negative comments, my observations are:

  • A lot of the negative reviews are about technical glitches, unresponsive support and quality control.

  • A few parents took issue with how it is tethered to a mobile phone; they did not want their kids glued to a smartphone. This issue is going to be germane to a lot of smart devices in the future: how can we build things that do not need a mobile phone as a driver?

  • While most people love the personality of the toy, it gets repetitive after a while. Keeping a device's personality engaging over an extended period is a challenge that the creators will need to address.

  • Cozmo has a high engagement plateau. Some kids drop off very early and don't see the point in the hassle of setting it up and playing with it, whereas the ones who praise it seem to have stuck with it for a while.

So, clearly the personality is a hit, but the intelligence seems quite basic. So I started looking at people who have been hacking Cozmo and using it as a platform. And guess what: there is an ocean of such content!

YouTube is full of Cozmo hack videos. Link

Someone even made their own. Link

The subreddit is extremely active with videos, support help, hacks, mods and what not! Link

In my observation, Cozmo has managed to create a small following of people invested in the platform, but my feeling is that it has the same issues as the Kinect: people don't see the value upfront because the out-of-the-box experience doesn't hold up to the promises made, but enthusiasts love it for the flexibility and extensibility it offers. I wonder what direction Cozmo is going to take from here.

The biggest take-aways for me after delving deep into the world of smart toys are:

  • With buzzwords like intelligent, smart and AI thrown around, are we setting high expectations for a toy that then fails to live up to the hype? How can we set realistic expectations?

  • Are we diluting what we mean by smartness? What is so smart about Cozmo when most of its behavior feels programmed rather than emergent?

  • How can the behavior of an AI toy feel more organic across time?

  • How can we design the out-of-the-box experience in a way that connects with an impatient kid so that they do not give up on it within a few hours?

  • The creation of a personality is paramount for a smart device. I wonder how the designers, engineers and product developers worked together to create Cozmo's unique personality. It's a great case study, and having worked on large teams, I understand how hard it is to pull something like this off. The tight integration between multi-disciplinary teams is something I would like to understand more.

And while I chew on these questions, here are a few lovely videos of Cozmo with animals.


For this week's assignment, we had to make a web dashboard for controlling a Philips Hue bulb. I was pretty stretched for time this week, so I decided to keep things simple and learn the basics. The tutorial was pretty self-explanatory and the code on GitHub was logical. However, I tripped up on the callbacks, which were quite confusing to me. Thankfully, I read through Timothy and Atharva's blogs and their callbacks made sense to me. I created a basic UI where the bulb can be controlled by changing its hue, saturation and brightness values through sliders, and I also gave users the option to turn it on and off. I also changed the background of the webpage to match the color of the Hue bulb.


I looked at changing the color of the text to a complementary color based on the currently selected value, but did not find an easy way to convert an HSB value to its complement. I had also thought of an ambient mode where the digits of HH:MM:SS are converted into an RGB hex value which is then transmitted to the Hue. Again, I tripped up because I could not find a reliable way to do this; converting RGB to HSL did not match the colors. So my top questions for this week are:

1) How do you calculate the complementary value of an HSB color in code?

2) How do you convert an RGB color to an HSB color?
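Since writing this, I sketched out answers to both questions in JavaScript. This is my own take on the standard conversion math, not code from any of the tutorials linked above:

```javascript
// Convert r, g, b (each 0-255) to { h, s, b } with h in 0-360
// and s, b in 0-100, using the standard max/min formulation.
function rgbToHsb(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  const delta = max - min;
  let h = 0;
  if (delta !== 0) {
    if (max === r) h = 60 * (((g - b) / delta) % 6);
    else if (max === g) h = 60 * ((b - r) / delta + 2);
    else h = 60 * ((r - g) / delta + 4);
  }
  if (h < 0) h += 360; // wrap negative hues back into 0-360
  const s = max === 0 ? 0 : (delta / max) * 100;
  return { h: h, s: s, b: max * 100 };
}

// The complement sits halfway around the hue wheel, so add half
// the hue range and wrap. On the Hue API's 0-65535 hue scale the
// same trick would be (hue + 32768) % 65536.
function complementaryHue(h) {
  return (h + 180) % 360;
}
```

Saturation and brightness are usually kept the same for a complement; only the hue flips.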

Currently listening: A Whiter Shade of Pale - Procol Harum

References used: 



//IP address of Philips Hue
let IPHub = '';
let userName = 'Your user name goes here'; // My user name as per the hue developer API
let url;

let gotham;

//Variables for display of controls and labels
let checkBox;
let hueSlide;
let hueText;
let satSlide;
let satText;
let brightSlide;
let brightText;

//Variables for controlling the philips bulb and its color
let lightNum = 1;
let hueVal = 32767;
let satVal = 127;
let brightVal = 127;

//Loading fonts
function preload() {
  gotham = loadFont('assets/Gotham Book.otf');
}

function setup() {

  //Create canvas, matching the colour mode to the Hue API's ranges
  canvas = createCanvas(windowWidth, windowHeight);
  colorMode(HSB, 65535, 254, 255);
  background(hueVal, satVal, brightVal);

  //Declare Hue URL
  url = "http://" + IPHub + "/api" + userName;

  //Position ON/OFF checkbox and wire it to toggle the bulb
  checkBox = createCheckbox(' ON/OFF', false);
  checkBox.position(canvas.width / 2 - 90, 200);
  checkBox.changed(toggleLight);

  //Position sliders for colour control (rotated to vertical)
  hueSlide = createSlider(0, 65535, 32767, 100);
  hueSlide.position(canvas.width / 2 - 225, 400);
  hueSlide.style('rotate', '90deg');

  satSlide = createSlider(0, 254, 127, 1);
  satSlide.position(canvas.width / 2 - 100, 400);
  satSlide.style('rotate', '90deg');

  brightSlide = createSlider(1, 255, 127, 1);
  brightSlide.position(canvas.width / 2 + 25, 400);
  brightSlide.style('rotate', '90deg');

  //Position button for 'Ambient Mode' (wired to the unfinished changeBG below)
  button = createButton('Ambient mode');
  button.position(canvas.width / 2 - 85, 600);
  button.mousePressed(changeBG);
}

function draw() {

  if (hueSlide.value() != hueVal || satSlide.value() != satVal || brightSlide.value() != brightVal) {

    //Capture slider values
    hueVal = hueSlide.value();
    satVal = satSlide.value();
    brightVal = brightSlide.value();

    //Change the bulb's colour to match
    changeLightColour();

    //Change background
    colorMode(HSB, 65535, 254, 255);
    background(65535 - hueVal, 254 - satVal, 255 - brightVal);

    //Display title and slider labels
    textSize(width / 15);
    textAlign(CENTER, CENTER);
    text('Huehuehue', width / 2, 100);

    textSize(width / 60);
    text('Hue', canvas.width / 2 - 160, 500);
    text('Sat', canvas.width / 2 - 35, 500);
    text('Brightness', canvas.width / 2 + 100, 500);
  }
}

function toggleLight() {

  let path = url + '/lights';
  httpDo(path, 'GET', toggleGetResponse);
}


function toggleGetResponse(getData) {

  let lights = JSON.parse(getData);
  lightState = lights["1"].state.on;

  let body = {
    'on': !lightState
  };
  let path = url + '/lights/' + lightNum + '/state/';
  httpDo(path, 'PUT', body, togglePutData);
}


function togglePutData(putData) {

  var response = JSON.stringify(putData);
  if (response.includes("success")) {
    lightState = !lightState;
  }
}

function changeLightColour() {
  var path = url + '/lights/' + lightNum + '/state';
  var body = {
    'bri': 255 - brightSlide.value(),
    'sat': 254 - satSlide.value(),
    'hue': 65535 - hueSlide.value()
  };
  httpDo(path, 'PUT', body, changeColourResponse);
}

function changeColourResponse() {
  console.log('Colors changed!');
}

function changeBG() {
  //'Ambient mode': meant to map HH:MM:SS to a colour (still unfinished)
  let hr = hour();
  let mn = minute();
  let sc = second();
}


Disobedient electronics

The theme for our second assignment was to create an object that exemplifies the ethos of disobedient electronics. I teamed up with Winnie Yoe and in our first discussion, we decided to make a few learning objectives for ourselves. Our initial list was: 1) Learn how to use ESP32 2) Learn how to fetch and display real time data 3) Use data to work with a mundane regular object that we see day to day.

Initially, we looked at the NYC open data sets and found some interesting data around maternal health, mental health and the drug crisis. We were interested in using the opioid crisis data, but we realised that none of the datasets was real-time and none had granularity beyond a district zone. Working with such large datasets was proving challenging, so we gave up on the approach.

During the discussion, we started talking about how mundane objects are basically fronts for corporations inside our homes, in the name of 'smartness'. That struck a chord, and we refined the idea into a simple 'smart' bulb that is free to use but won't light up if the company's latest stock price is lower than the previous day's. Going through stock market price APIs, we found one that was easy to use but only gave daily prices. We wanted hourly data, but in the interest of time we went ahead with the one we found to build the proof of concept. We used the ESP32 HUZZAH to control the light bulb.
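The comparison at the heart of the bulb is tiny. Here it is as a JavaScript sketch (the actual firmware was Arduino code on the ESP32, and the shape of the price data here is a placeholder for whatever the daily-price API returns):

```javascript
// Decide whether the bulb should blink: compare today's close to
// yesterday's. `prices` is an array of daily closes, oldest first.
// (Illustrative sketch; the real version parsed the API's JSON on-chip.)
function shouldBlink(prices) {
  const today = prices[prices.length - 1];
  const yesterday = prices[prices.length - 2];
  return today < yesterday; // blink annoyingly on a down day
}
```

With hourly data, the same function would just compare the last two hourly values instead.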

The final interaction was as follows: the bulb lights up when it detects the presence of the user, then checks the stock price of the company (*cough* Facebook *cough*); if the price is lower, it starts blinking annoyingly. The user then has to mash the 'like' button, which leaves gratuitous comments on social media (not prototyped), and the bulb is ready for use again. You can watch the interaction in the video below.

I was quite happy about getting the APIs to work with the chip. I realise that there are conceptual gaps in our prototype but a lot of it was pared down in the interest of time. I believe that there is enough depth in the concept to take it further and I would like to see if I can do the same project in a more refined manner later.

A piece of velvet

I grew up in a small city in India. It was hot, dry and utterly boring in a way that only small cities in 80's India could be. I was born premature, which made me pretty sick through my early years, and having no brothers and sisters, I was pretty much in my own head. A bursting imagination often needs outlets, and for me, it came in the form of playing with wooden toys. My family did not have a lot of money, so LEGOs, action figures and fancy toys were out. But as a child, who cared? A few blocks of wood, plastic and boxes and you've got a castle going! And in 80's India, no one around me had any expensive, manufactured toys, so it wasn't as if I felt the need for something that wasn't being given to me. I was pretty happy in my own head, until I saw an advertisement for a GI Joe.


GI Joes were probably the first thing I ever wanted. I was entranced, and I remember throwing tantrums to have them. My parents couldn't really afford them, so they would try to keep me away, but being a male child in India comes with doting grandparents and uncles who would cater to my whims. My frustrated parents couldn't really say anything, and in a few years I had a collection of about 50 of them.


But this is not the story of the GI Joes.


One summer, while spending the vacation at my maternal grandmother's place, I came upon a box which held a small piece of velvet, two tiny pillows and a small piece of wood. The velvet was bedraggled, with aluminum milk-bottle caps stuck on it; the pillows were made out of cloth; and the piece of wood was, well, a piece of wood. When I asked my grandmother about it, she told me it was a bed for my mother's doll. My family is one that was torn apart by the partition of India, and both my grandfathers had to leave everything they knew behind to start from scratch. So we never had a lot of money, and buying a doll was impossible. But that didn't stop my grandmother and mother. They made dolls from whatever material they could find, and built a bed, a blanket and pillows. My mom grew up playing with a stuffed piece of cloth and treasured it long after she had outgrown it.

On that summer afternoon, it all came rushing to me: how entitled I had been to ask for an expensive piece of plastic that was way above our means, while my parents still tried to do the best they could. I felt the insides of my stomach churn. As a child, I had no way to understand what I was feeling, but that feeling created a sense of gratitude for them trying to do the best for me in whatever way they could. The little piece of velvet became a part of my GI Joe collection. After a long, hard day of fighting, they were all put to sleep under my mom's velvet blanket. After all, warriors need to sleep. I often wonder what they dreamt of. What would people who fought all day dream of? Do they dream of peaceful times, or more war? And under the glittering, shiny blanket, would they be happy? I did not know, but it was fun to imagine.


GI Joes unleashed my imagination. Simulation video games and construction kits later shaped my intellect and thinking and unleashed my ability to make. But my mother's piece of velvet taught me gratitude, kindness and softness. And for that, I am grateful. Growing up as a man in India, you have a lot of hard edges, as a patriarchal, masculine society shapes you to have. But a piece of cloth can round you out, like stones in a river. Who could have guessed?


The first week at ITP is bizarre. The floor turns into a bazaar, with students hopping in and out of classes and checking Albert more than Instagram. Caught in the vortex of this hurricane that sweeps through the floor, I somehow ended up in Light & Interactivity (people who dropped the class, I owe you one!). So without much ado, here's the first assignment.

My task: To fade an LED without using linear PWM. (It’s not the first semester anymore!)

Now, the task seemed deceptively simple. All you had to do was figure out a curve pattern, work out the equation of the curve and voila! An expressive LED. That was until I hit an issue that is, apparently, an open secret. To explain, here is the first video:

As you watch the LED fade, trace an imaginary graph of the increase in the light with your fingers. You will come to a realisation which is this:


The curve on the left is what was used to program the LED (linear PWM), but your eyes see what is essentially exponential growth. This article does a great job of explaining the issue, and some good discussion can be found here.

So it was clear that the curve needed to be compensated in the opposite direction to create a more linear-looking fade. I came across this article, which suggested an equation for achieving this, and the result felt much better.
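For reference, one common shape for this kind of compensation is an exponential mapping from input step to PWM duty. This is a generic sketch of the idea in JavaScript (not the exact equation from the article): equal input steps produce equal *perceived* brightness steps.

```javascript
// Map a linear input step (0..steps) to an 8-bit PWM duty (0..255)
// along an exponential curve, so each step looks evenly spaced to
// the eye. r scales the curve so step == steps lands exactly on 255.
function perceptualPWM(step, steps) {
  const r = steps / Math.log2(256);
  return Math.round(Math.pow(2, step / r) - 1);
}
```

On a microcontroller this is usually precomputed into a lookup table rather than evaluated every frame.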

This seemed like a good point to try out more curves. First comes the normal sine fade from Tom’s example.

Watching this go on and off, I thought it would be cool to replicate the 'breathing' light on the Mac laptops of old. It turns out the pattern is patented (duh!), and Lady Ada tried to reverse engineer it but did not publish the curve equation. More on that here. If you look at the wave function on the oscilloscope, it looks like a sinusoid with the top clipped off at the peak. I assumed I would have to do the math for it, but lo and behold, the internet giveth in abundance! Someone had written a great blog post on the topic and done the math. Woohoo! It's a great post which fully explains how to derive an equation from a curve using Wolfram Alpha; read it here. Off I went and wrote an Arduino sketch, with the results below:
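To give a rough idea of the shape, here is a clipped sinusoid sketched in JavaScript. This is just the 'sine with the top cut off' idea described above, not the equation the blog post actually derived:

```javascript
// Scale a sine wave past 1 and clamp it, which flattens the top of
// each peak into the 'breathing' plateau. Negative half-cycles clamp
// to 0, i.e. the LED rests between breaths. One breath is ~4 seconds.
function clippedSine(tMillis, clipFactor = 1.2) {
  const raw = clipFactor * Math.sin(tMillis / 2000.0 * Math.PI);
  const clamped = Math.min(1, Math.max(0, raw));
  return Math.round(clamped * 255); // 8-bit PWM duty
}
```

Raising clipFactor widens the flat plateau at the top of each breath.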

I am not sure if you can see the difference, but a small, subtle change in the graph can create perceptible differences. After scratching the itch of the MacBook light, I started looking at other repos on GitHub and came across this repo, which implements a sine transition as a quadratic equation. The author has a great post explaining his approach to balancing performance and ease of use while developing the library here.

The result looks like this:

While doing these experiments, I started thinking of the motion curves used for defining animations and wondered if they would be of any use. It turns out there is an old library which has converted all of Robert Penner's iconic easing curves for Arduino. It was written for controlling servos, but with a few tweaks I could get it to work with LEDs:
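As a taste of what those curves look like in code, here is one of Penner's easings (easeInOutQuad) applied to brightness, sketched in JavaScript rather than the Arduino library's C:

```javascript
// Penner's easeInOutQuad: accelerate for the first half of the fade,
// decelerate for the second. t runs from 0 (start) to 1 (end).
function easeInOutQuad(t) {
  return t < 0.5 ? 2 * t * t : 1 - Math.pow(-2 * t + 2, 2) / 2;
}

// Map the eased progress to an 8-bit PWM duty.
function easedBrightness(t) {
  return Math.round(easeInOutQuad(t) * 255);
}
```

Any of the other Penner curves can be dropped in place of easeInOutQuad with the same 0..1 in, 0..1 out contract.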

I did not get much time with the library, but on first impression it's extremely easy to use for any motion with an Arduino. BUT the light fades are not as pretty as the motion curves, either because of perceptual differences or because the library needs modifications. I shall dig into this more later and report back.

Currently listening: Lucy in the Sky with Diamonds - The Beatles

Week 1: Just another basic server.

For this week's assignment, we were asked to create a simple HTTP server using node.js and express. Both these terms were completely new to me, and I decided to keep my ambitions in check and build something that works instead of repeating the glorious failures of ICM and P.Comp in the semester past.

For starters, I familiarized myself with node and express with Dan Shiffman’s videos. (Link)

The ‘Programming A to Z’ website also has some great explanations on working with node.js and express (Link)

For my assignment, inspired by a combination of hanging out with small bots in 'Hacking Smart Toys for AI Learning' and listening to Leonard Nimoy narrate Ray Bradbury's 'There Will Come Soft Rains', I decided to make a web server to control a bot in the following ways:

  • Make the robot move ahead. (/forward)

  • Make the robot move behind. (/back)

  • Make the robot turn left or right. (/turn/[:left or :right])

  • Make the robot dance. (/happydance)

The code was pretty uneventful, except for constantly having to restart the server. Another part that tripped me up: at home, I encountered an error when trying "my network IP":8080 instead of localhost:8080. It works like a charm inside ITP, though. Maybe it was happening because I was on a hotspot, but I had no idea how to rectify it. I would like to know more about how to identify and fix such network issues.

/* References used:
   4-line server example from Tom Igoe's class:
   https://github.com/tigoe/NodeExamples/blob/master/FourLineServer/server.js
   Dan Shiffman's videos from Coding Train */

//Include express
let express = require('express');

//Create a server
let server = express();

//Serve static files from the 'pages' folder
server.use('/', express.static('pages'));

//GET routes
server.get('/turn/:direction', turnBabyTurn);
server.get('/forward', moveForward);
server.get('/back', moveBack);
server.get('/happydance', happyDance);

//Start the server (port 8080, as used above)
server.listen(8080);

//Functions to send responses to GET requests

//Robot turn
function turnBabyTurn(request, response) {
  let newTurnState = request.params.direction;
  if (newTurnState == 'left' || newTurnState == 'right') {
    response.send('The robot makes a sharp turn to the ' + newTurnState);
  } else {
    response.send('Something went terribly wrong. Bots are stupid like that. Try left or right?');
  }
}

//Move back
function moveBack(request, response) {
  response.send('The bot retreats back, not knowing what lies behind it.');
}

//Move forward
function moveForward(request, response) {
  response.send('The bot whirrs forward towards an indeterminate future.');
}

//Happy dance
function happyDance(request, response) {
  response.send('The bot spins on its own axis, silent and alone.');
}


Currently listening: Keep Talking - Pink Floyd

Apology as a Service (AAAS)

For our first assignment for Critical Objects, I teamed up with Winnie Yoe to work on a critical object. The shop was shut for the week, and we started talking about how we should write an apology for not doing the assignment. This led us to a further discussion of how apologies are manufactured, as if they follow a formula.

We went down the rabbit hole of digging up apologies from Kevin Spacey to Facebook to Uber and many others, and we came up with the following formula:

[Inspirational title] → [Demonstrate passion] → [Play the Victim] → [Feign innocence of events] → [Cautiously appreciate the victims] → [Ask for time]→ [Recognise role of company without any direct acceptance of wrong-doing]→ [Promise indeterminate actions in the future]→ [Promise that it won’t happen again]→ [salutations]→ [Actual signature].
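The formula reads almost like code, so here it is as a literal JavaScript sketch. Every phrase below is our own invented filler, not lifted from any real apology:

```javascript
// Assemble a corporate non-apology by walking the formula's stages
// in order. All strings are hypothetical placeholder copy.
function generateApology(company, ceo) {
  const stages = [
    'Moving Forward, Together',                        // inspirational title
    'Nothing matters more to us than your trust.',     // demonstrate passion
    'These past weeks have been hard on all of us.',   // play the victim
    'We were as surprised as anyone by these events.', // feign innocence of events
    'We hear the concerns some of you have raised.',   // cautiously appreciate the victims
    'We ask for your patience as we learn more.',      // ask for time
    company + ' can always do better.',                // recognise role, accept nothing
    'We are committed to meaningful change.',          // promise indeterminate actions
    'We will make sure this never happens again.',     // promise it won't happen again
    'Sincerely,',                                      // salutations
    ceo                                                // actual signature
  ];
  return stages.join('\n');
}
```

Swap any stage's filler text and the structure still scans, which is rather the point.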

While analysing the responses, we realised that the formula also caters to institutional anxieties: the apologies are about protecting the organisation rather than the aggrieved. We came up with the idea of a future service for CEOs, a voice-driven interface for generating apologies. We named it Bernays, after Edward Bernays, the father of modern PR, documented quite extensively in The Century of the Self.

The hypothetical device sits on the desk of the CEO and talks to them about the current issue, using advanced AI to understand the situation. It asks the CEO for 'uncomputable' information, which helps it create a more nuanced approach, and generates an apology and a strategy for handling the situation.

You can scroll through the UI below. A sister post on the project can be found here.