What does the world think of Cozmo?

With Cozmo, it was love at first sight. So, when we were given the assignment of reviewing a toy, I could not think of going with anything but Cozmo <3.

Cozmo is a very popular toy that got massive publicity when it launched in 2016. It was covered by most tech publications (Link 1, Link 2, Link 3) and people (me included!) have been going ga-ga over it. But does the toy hold up two years later? Let’s find out!

First stop, the Cozmo website!

Cozmo is marketed as a toy with brains and personality. The home page (Link) talks about Cozmo as an accomplice that fits into your home, and the first page has multiple videos of what Cozmo can do. It’s interesting to observe that the videos focus on Cozmo interacting and doing things rather than on its design and looks. Clearly, the company is confident in Cozmo’s personality as the driver for sales.

Screenshot 2019-02-22 10.17.30.png
Screenshot 2019-02-22 10.23.45.png

The second tab on the homepage is Life with Cozmo, which is an interesting choice. The tech tab is fourth in the navigation menu, which usually is not the case. Most companies love to show off the tech first, but the makers of Cozmo decided to stick with what it can do. *Applause*

One of the more interesting features, animal detection, is hidden inside the tech section. They could have brought it up front to convince families that Cozmo would fit into a household perfectly.

Screenshot 2019-02-22 10.26.59.png

So: brains, personality, smarts and fun engagement for the whole family (animals included!). Does Cozmo live up to these promises? Let’s find out.

My first stop is Amazon, where it has a great rating of 4.4! Link

Screenshot 2019-02-22 10.34.39.png

Reading through the positive and negative comments, my observations are:

  • A lot of the negative reviews stem from technical glitches, unresponsive support and quality-control problems.

  • A few parents took issue with how it was tethered to a mobile phone: they did not want their kids to be glued to a smartphone. This issue is going to be germane to a lot of smart devices in the future. How can we build things that do not need a mobile phone as the driver?

  • While most people love the personality of the toy, it gets repetitive after a while. How developers can keep building a device’s personality over extended periods of time is a challenge that the creators will need to address.

  • Cozmo has a steep engagement curve. Some kids drop off very early and don’t see the point in the hassle of setting it up and playing with it, whereas the ones who have praised it seem to have stuck with it for a while.

So, clearly the personality is a hit but the intelligence seems to be quite basic. So I started to look at people who have been hacking Cozmo and using it as a platform. And guess what? There is an ocean of such content!

YouTube is full of Cozmo hack videos. Link

Someone even made their own. Link

The sub-reddit is extremely active with videos, support help, hacks, mods and what not! Link

In my observation, Cozmo has managed to create a small following of people who are invested in the platform, but my feeling is that it has the same issues as the Kinect: people don’t see the value up front because the out-of-the-box execution doesn’t hold up to the promises made, but enthusiasts love it for the flexibility and extensibility it offers. I wonder what direction Cozmo is going to take from here.

My biggest takeaways after delving deep into the world of smart toys are:

  • With buzzwords like intelligent, smart and AI thrown around, are we setting expectations for a toy so high that it then fails to live up to the hype? How can we set realistic expectations?

  • Are we diluting what we mean by smartness? What is so smart about Cozmo when most of its behavior feels programmed rather than emergent?

  • How can the behavior of an AI toy feel more organic across time?

  • How can we design the out-of-the-box experience in a way that connects with an impatient kid, so that they don’t give up on it within a few hours?

  • The creation of a personality is paramount for a smart device. I wonder how the designers, engineers and product teams worked together to create Cozmo’s unique personality. It’s a great case study, and having worked on large teams, I can understand how hard it is to pull something like this off. The tight integration between multi-disciplinary teams is something I would like to understand more.

And while I chew on these questions, here are a few lovely videos of Cozmo with animals.

Huehuehue

For this week’s assignment, we had to make a web dashboard for controlling a Philips Hue bulb. I was pretty stretched for time this week, so I decided to keep things simple and learn the basics. The tutorial was pretty self-explanatory and the code on GitHub was logical. However, I tripped up on the callbacks, which were quite confusing for me to understand. Thankfully, I read through Timothy and Atharva’s blog and their callbacks made sense to me. I created a basic UI where the bulb can be controlled by changing its Hue, Saturation and Brightness values through sliders, and I also gave users the option to turn it on and off. I was also changing the background of the webpage to match the color of the Hue bulb.

Screenshot 2019-02-12 04.26.39.png


I looked at changing the color of the text to a complementary color based on the currently selected value, but did not find any easy way to convert HSB values to their complementary colors. I had also thought of an ambient mode where the digits of HH:MM:SS are converted into an RGB hex value which is then transmitted to the Hue. Again, I tripped up because I could not find a reliable way to do this: converting RGB to HSL did not match the colors. So my top questions for this week are:

1) How do you calculate the complementary HSB value of a color through code?

2) How do you convert an RGB color to an HSB color?
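Having stewed on it a little, here is a rough sketch of how I would attempt both. This is my own guess at an approach, not something from the tutorial: rotating the hue halfway around the wheel gives the complement, and the standard RGB→HSV math handles the conversion.

//1) Complementary color: rotate the hue 180 degrees around the wheel.
//   (For the Hue API's 16-bit hue, add 32768 modulo 65536 instead.)
function complementHSB(h, s, b) {
  return { h: (h + 180) % 360, s: s, b: b };
}

//2) RGB (0-255 per channel) to HSB, via the standard RGB->HSV formulas.
//   Returns H in [0, 360), S and B in [0, 100].
function rgbToHSB(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  const delta = max - min;
  let h = 0;
  if (delta !== 0) {
    if (max === r) h = 60 * (((g - b) / delta) % 6);
    else if (max === g) h = 60 * ((b - r) / delta + 2);
    else h = 60 * ((r - g) / delta + 4);
  }
  if (h < 0) h += 360;
  const s = max === 0 ? 0 : (delta / max) * 100;
  return { h: h, s: s, b: max * 100 };
}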

Currently listening: A whiter shade of pale - Procol Harum

/* 
References used: 
https://www.thatworkedyesterday.com/blog/2019/2/12/connected-devices-web-interface-for-hue

https://github.com/tigoe/hue-control/blob/master/client-example/public/single-lamp.js

*/

//IP address of Philips Hue
let IPHub = '128.122.151.172';
let userName = 'Your user name goes here'; // My user name as per the hue developer API
let url;

//Font
let gotham;

//Variables for the controls and labels
let canvas;
let checkBox;
let button;
let hueSlide;
let satSlide;
let brightSlide;

//Current on/off state of the lamp (set when we first query the hub)
let lightState = false;

//Variables for controlling the philips bulb and its color
let lightNum = 1;
let hueVal = 32767;
let satVal = 127;
let brightVal = 127;

//Loading fonts
function preload() {
  gotham = loadFont('assets/Gotham Book.otf');
}

function setup() {

  //Create canvas and paint the initial background
  canvas = createCanvas(windowWidth, windowHeight);
  colorMode(HSB, 65535, 254, 255);
  background(hueVal, satVal, brightVal);

  //Base URL of the Hue hub's REST API (note the slash after /api)
  url = 'http://' + IPHub + '/api/' + userName;

  //Position ON/OFF checkbox
  checkBox = createCheckbox(' IS ON/OFF', false);
  checkBox.position(canvas.width / 2 - 90, 200);
  checkBox.class('lightSwitch');
  checkBox.mouseClicked(toggleLight);

  //Position sliders for color control (rotated to vertical with CSS)
  hueSlide = createSlider(0, 65535, 32767, 100);
  hueSlide.position(canvas.width / 2 - 225, 400);
  hueSlide.style('transform', 'rotate(90deg)');

  satSlide = createSlider(0, 254, 127, 1);
  satSlide.position(canvas.width / 2 - 100, 400);
  satSlide.style('transform', 'rotate(90deg)');

  brightSlide = createSlider(1, 255, 127, 1);
  brightSlide.position(canvas.width / 2 + 25, 400);
  brightSlide.style('transform', 'rotate(90deg)');

  //Position button for 'Ambient Mode'
  button = createButton('Ambient mode');
  button.position(canvas.width / 2 - 85, 600);
  button.mousePressed(changeBG);

}

function draw() {

  if (hueSlide.value() != hueVal || satSlide.value() != satVal || brightSlide.value() != brightVal) {

    //Send the new color to the bulb
    changeLightColour();

    //Capture slider values
    hueVal = hueSlide.value();
    satVal = satSlide.value();
    brightVal = brightSlide.value();

    //Match the page background to the bulb
    colorMode(HSB, 65535, 254, 255);
    background(65535 - hueVal, 254 - satVal, 255 - brightVal);

    //Display title and slider labels
    textFont(gotham);
    textSize(width / 15);
    textAlign(CENTER, CENTER);
    text('Huehuehue', width / 2, 100);

    textSize(width / 60);
    text('Hue', canvas.width / 2 - 160, 500);
    text('Sat', canvas.width / 2 - 35, 500);
    text('Brightness', canvas.width / 2 + 100, 500);
  }
}

function toggleLight() {
  //Query the hub for the current state of the lights first
  let path = url + '/lights';
  httpDo(path, 'GET', toggleGetResponse);
}

function toggleGetResponse(getData) {
  let lights = JSON.parse(getData);
  lightState = lights['1'].state.on;

  //Flip the state we just read
  let body = {
    'on': !lightState
  };
  let path = url + '/lights/' + lightNum + '/state';
  httpDo(path, 'PUT', body, togglePutData);
}

function togglePutData(putData) {
  let response = JSON.stringify(putData);
  if (response.includes('success')) {
    lightState = !lightState;
  }
}

function changeLightColour() {
  //The sliders are rotated, so their values are inverted before sending
  let body = {
    'bri': 255 - brightSlide.value(),
    'sat': 254 - satSlide.value(),
    'hue': 65535 - hueSlide.value()
  };
  let path = url + '/lights/' + lightNum + '/state';
  httpDo(path, 'PUT', body, changeColourResponse);
}

function changeColourResponse() {
  console.log('Colors changed!');
}

function changeBG() {
  //Ambient mode was left unfinished (see above): the plan was to turn
  //the digits of HH:MM:SS into an RGB hex value and send it to the bulb.
  let hr = hour();
  let mn = minute();
  let sc = second();
  console.log(hr, mn, sc);
}

Disobedient electronics

The theme for our second assignment was to create an object that exemplifies the ethos of disobedient electronics. I teamed up with Winnie Yoe, and in our first discussion we decided to set a few learning objectives for ourselves. Our initial list was: 1) learn how to use the ESP32, 2) learn how to fetch and display real-time data, 3) use data to work with a mundane, regular object that we see day to day.

Initially, we looked at the NYC open data sets and found some interesting data around maternal health, mental health and the drug crisis. We were interested in using the data set for the opioid crisis, but we realised that none of the datasets we had were real-time, and none had granularity beyond a district zone. Working with such large data-sets was proving to be challenging, and we gave up on the approach.

During the discussion, we started talking about how mundane objects are basically fronts for corporations inside our homes, in the name of ‘smartness‘. That struck a chord, and we refined the idea into a simple ‘smart‘ bulb that is free to use but won’t light up if the company’s latest stock price is lower than the previous day’s. Going through stock price APIs, we found one that was easy to use but only gave daily prices. We wanted hourly data, but in the interest of time we went ahead with the one we found to build the proof of concept. We used the ESP32 HUZZAH to control the light bulb.

The final interaction was as follows: the bulb lights up if it detects the presence of the user and then checks the stock price of the company (*cough* Facebook *cough*); if the price is lower, it starts blinking annoyingly. The user then has to mash the ‘like‘ button, which leaves gratuitous comments on social media (not prototyped), and the bulb is ready for use again. You can watch the interaction in the video below.

I was quite happy about getting the APIs to work with the chip. I realise that there are conceptual gaps in our prototype, but a lot of it was pared down in the interest of time. I believe there is enough depth in the concept to take it further, and I would like to see if I can redo the project in a more refined manner later.

A piece of velvet

I grew up in a small city in India. It was hot, dry and utterly boring in a way that only small cities in 80’s India can be. I was born premature, which made me pretty sick through my early years, and having no brothers and sisters, I was pretty much in my own head. A bursting imagination often needs outlets, and for me it came in the form of playing with wooden toys. My family did not have a lot of money, so LEGOs, action figures and toys were out. But as a child, who cared? A few blocks of wood, plastic and boxes and you’ve got a castle going! And in 80’s India, no one around me had any expensive, manufactured toys, so it wasn’t as if I felt the need for something that wasn’t being given to me. I was pretty happy in my own head until I saw an advertisement for a GI Joe.

 

GI Joes were probably the first things I ever wanted. I was entranced, and I remember throwing tantrums to have them. My parents couldn’t really afford them, so they would try to keep me away, but being a male child in India comes with doting grandparents and uncles who would try to cater to my whims. My frustrated parents couldn’t really say anything, and in a few years I had a collection of about 50 of them.

 

But this is not the story of the GI Joes.

 

One summer, while spending the summer vacation at my maternal grandmother’s place, I came upon a box which held a small piece of velvet, two tiny pillows and a small piece of wood. The velvet was bedraggled, with aluminum milk-bottle caps stuck on it; the pillows were made out of cloth; and the piece of wood was, well, a piece of wood. When I asked my grandmother about it, she told me it was a bed for my mother’s doll. My family is one that was torn apart by the partition of India, and both my grandfathers had to leave everything they knew behind to start from scratch. So we never had a lot of money, and buying a doll was impossible. But that didn’t stop my grandmother and mother. They made dolls from whatever material they could find, and built a bed, a blanket and pillows. My mom grew up playing with a stuffed piece of cloth and treasured it long after she had outgrown it. On that summer afternoon, it all came rushing to me: how entitled I was to ask for an expensive piece of plastic that was way above our means, while my parents still tried to do the best they could. I felt the insides of my stomach churn, and I had no way to understand what I was feeling as a child, but that feeling created a sense of gratitude for them trying to do the best for me in whatever way they could. The little piece of velvet became a part of my GI Joe collection. After a long, hard day of fighting, they were all put to sleep under my mom’s velvet blanket. After all, warriors need to sleep. I often wondered what they dreamt of. What would people who fought all day dream of? Do they dream of peaceful times or more war? And under the glittering, shiny blanket, would they be happy? I did not know, but it was fun to imagine.

 

GI Joes unleashed my imagination. Simulation video games and construction kits later shaped my intellect and thinking, and unleashed my ability to make. But my mother’s piece of velvet taught me gratitude, kindness and softness. And for that, I am grateful. Growing up as a man in India, you have a lot of hard edges, as a patriarchal, masculine society shapes you to be. But a piece of cloth can round you out like stones in a river. Who could have guessed?

Blink

The 1st week in ITP is bizarre. The floor turns into a bazaar, with students hopping in and out of classes and checking Albert more than Instagram. Caught in the vortex of this hurricane that sweeps through the floor, I somehow ended up in Light & Interactivity (people who dropped the class, I owe you one!). So without much ado, here’s the first assignment.

My task: to fade an LED without using linear PWM. (It’s not the first semester anymore!)


Now, the task seemed deceptively simple. All you had to do was figure out a curve, work out its equation and voila! An expressive LED. That was until I hit an issue that is, apparently, an open secret. To explain further, here is the first video:

As you watch the LED fade, trace an imaginary graph of the increase in light with your finger. You will come to a realisation, which is this:

Paper.Light.4 (1).jpg

The curve on the left is what was used to program the LED (linear PWM), but what your eyes see is essentially exponential growth. This article does a great job of explaining the issue, and some good discussion can be found here.

So, it was clear that the curve needed to be compensated for in the opposite direction to create a more linear-feeling fade. I came across this article, which suggested an equation for achieving exactly that, and the result felt much better.
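For illustration, here is one commonly cited version of such a compensation equation, sketched in JavaScript rather than the Arduino code I actually ran. It maps each linear step onto an exponential PWM value:

//Map a linear step (0-255) to a perceptually even 8-bit PWM value
const steps = 255;  //number of fade steps
const maxPwm = 255; //8-bit PWM range

//R scales the curve so that the final step lands exactly on maxPwm
const R = (steps * Math.log10(2)) / Math.log10(maxPwm + 1);

function perceptualPwm(step) {
  return Math.round(Math.pow(2, step / R) - 1);
}

//A linear ramp of steps now produces a fade that *looks* linear:
//perceptualPwm(0) = 0, perceptualPwm(127) ≈ 15, perceptualPwm(255) = 255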

This seemed like a good point to try out more curves. First came the normal sine fade from Tom’s example.

Watching this go on and off, I thought it would be cool to replicate the ‘breathing‘ light on the Mac laptops of old. It turns out that the pattern is patented (duh!), and Lady Ada tried to reverse-engineer it but did not publish the curve equation. More on that here. If you look at the wave function on the oscilloscope, it looks like a sinusoid with the top clipped off at the peak. I assumed I would have to do the math for it, but lo and behold! The internet giveth in abundance! Someone had written a great blog post on the topic and done the math. Woohoo! It’s a great post which fully explains how to derive an equation from a curve using Wolfram Alpha; read it here. Off I went and wrote an Arduino sketch, with the results below:
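For flavor, here is the shape of a ‘breathing‘ curve that gets shared around a lot, in JavaScript for illustration. I am not claiming this is the exact equation from the linked post:

//brightness(t) = (e^sin(t) - 1/e) * 255 / (e - 1/e)
//exp(sin) swings between 1/e and e; shift and scale it to 0-255
function breathe(tMillis) {
  const x = (tMillis / 1000.0) * Math.PI; //one full breath every 2 s
  return (Math.exp(Math.sin(x)) - 1 / Math.E) * (255 / (Math.E - 1 / Math.E));
}

//e.g. sample it every 20 ms and write the value to the LED:
//setInterval(() => setLED(breathe(Date.now())), 20);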

I am not sure if you can see the difference, but a small, subtle change in the graph can create perceptible differences. Having scratched the itch of the MacBook light, I started looking at other repos on GitHub and came across this repo, which implements the sine transition as a quadratic approximation. The author has a great post explaining his approach to balancing performance and ease of use while developing the library here.

The result looks like this:

While doing these experiments, I started thinking of the motion curves that are used for defining animations and wondered if they would be of any use. It turns out there is an old library which has converted all of Robert Penner’s iconic easing curves for Arduino. It was written for controlling servos, but with a few tweaks I could get it to work with LEDs:

I did not get much time with the library, but on first impression it’s extremely easy to use for any motion with an Arduino, BUT the light fades are not as pretty as the motion curves, either because of perceptual differences or because the library needs modifications. I shall dig into this more later and report back.
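If you haven’t met them, Penner easing functions are just normalized curves over a 0-to-1 interval. A JavaScript sketch of one of them, mapped onto an 8-bit brightness ramp (my own illustration, not the library’s code):

//Penner-style easeInOutCubic: t runs from 0 to 1
function easeInOutCubic(t) {
  return t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;
}

//Brightness at `elapsed` ms into a fade of `duration` ms
function brightnessAt(elapsed, duration) {
  return Math.round(255 * easeInOutCubic(Math.min(elapsed / duration, 1)));
}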

Currently listening: Lucy in the sky with diamonds- The Beatles

Week 1: Just another basic server.

For this week’s assignment, we were asked to create a simple HTTP server using node.js and Express. Both of these were completely new to me, so I decided to keep my ambitions in check and build something that works, instead of the glorious failures of ICM and P.Comp in the semester past.

For starters, I familiarized myself with node and Express through Dan Shiffman’s videos. (Link)

The ‘Programming A to Z’ website also has some great explanations of working with node.js and Express. (Link)

For my assignment, a combination of hanging out with small bots in ‘Hacking smart toys for AI learning‘ and listening to Leonard Nimoy narrate Ray Bradbury’s ‘There will come soft rains‘ led me to make a web server that controls a bot in the following ways:

  • Make the robot move ahead. (/forward)

  • Make the robot move behind. (/back)

  • Make the robot turn left or right. (/turn/[:left or :right])

  • Make the robot dance. (/happydance)

The code was pretty uneventful, except for constantly having to turn the server on and off. Another part that tripped me up was that, at home, I encountered an error when trying “my network IP“:8080 instead of localhost:8080. It works like a charm inside ITP, though. Maybe it was happening because I was on a hotspot, but I had no idea how to rectify it. I would like to know more about how to identify and fix such network issues.

/*
References used:
4-line server example from Tom Igoe's class:
https://github.com/tigoe/NodeExamples/blob/master/FourLineServer/server.js

Dan Shiffman's videos from Coding Train:
https://www.youtube.com/watch?v=P-Upi9TMrBk&list=PLRqwX-V7Uu6Yyn-fBtGHfN0_xCtBwUkBp
*/

//Include express
let express = require('express');

//Create a server
let server = express();

//Serve static files from public
server.use('/', express.static('pages'));

//GET routes
server.get('/turn/:direction', turnBabyTurn);
server.get('/forward', moveForward);
server.get('/back', moveBack);
server.get('/happydance', happyDance);

//Start the server
server.listen(8080);

//Functions to send responses to GET requests

//Robot turn
function turnBabyTurn(request, response) {
  let newTurnState = request.params.direction;
  if (newTurnState == 'left') {
    response.send('The robot makes a sharp turn to the ' + newTurnState);
  } else if (newTurnState == 'right') {
    response.send('The robot makes a sharp turn to the ' + newTurnState);
  } else {
    response.send('Something went terribly wrong. Bots are stupid like that. Try left or right?');
  }
  response.end();
}

//Move back
function moveBack(request, response) {
  response.send('The bot retreats, not knowing what lies behind it.');
  response.end();
}

//Move forward
function moveForward(request, response) {
  response.send('The bot whirrs forward towards an indeterminate future.');
  response.end();
}

//Dance
function happyDance(request, response) {
  response.send('The bot spins on its own axis, silent and alone.');
  response.end();
}


Currently listening: Keep Talking-Pink Floyd

Apology as a Service (AAAS)

For our first assignment for Critical Objects, I teamed up with Winnie Yoe. The shop was shut for the week, and we started talking about how we should write an apology for not doing the assignment. This led us to a further discussion of how apologies are manufactured, as if they follow a formula.

We went down the rabbit-hole of digging up apologies from Kevin Spacey to Facebook to Uber and many others, and we came up with the following formula:

[Inspirational title] → [Demonstrate passion] → [Play the victim] → [Feign innocence of events] → [Cautiously appreciate the victims] → [Ask for time] → [Recognise role of company without any direct acceptance of wrong-doing] → [Promise indeterminate actions in the future] → [Promise that it won’t happen again] → [Salutations] → [Actual signature].
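Just for fun, here is the formula rendered as a (completely hypothetical) template function. Every phrase in it is invented for illustration:

//A tongue-in-cheek sketch: the apology formula as code
function generateApology(company, incident) {
  return [
    'Doing Better, Together',                                        //inspirational title
    `No one cares more deeply about this than we do at ${company}.`, //demonstrate passion
    'The past few weeks have been incredibly hard on our team.',     //play the victim
    `Like you, we were surprised to learn about ${incident}.`,       //feign innocence
    'We are thankful to those who bravely came forward.',            //cautiously appreciate the victims
    'We ask for patience as we investigate.',                        //ask for time
    'We recognise we have a role to play in doing better.',          //recognise role, admit nothing
    'We will be taking meaningful steps in the months ahead.',       //promise indeterminate actions
    'We are committed to making sure this never happens again.',     //promise it won't happen again
    'With humility and hope,',                                       //salutations
    'The CEO',                                                       //actual signature
  ].join('\n\n');
}

console.log(generateApology('Acme Corp', 'the data incident'));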

While analysing the responses, we realised that the formula also caters to institutional anxieties and is about protecting the organisation rather than the aggrieved. We came up with the idea of a future service for CEOs: a voice-driven interface for generating apologies. We named it Bernays, after Edward Bernays, the father of modern PR, documented quite extensively in The Century of the Self.

The hypothetical device sits on the desk of the CEO and talks to them about the current issue, using advanced AI to understand the situation. It asks the CEO for ‘uncomputable‘ information, which helps it create a more nuanced approach to the situation, and generates an apology and a strategy for handling it.

You can scroll through the UI below. A sister post on the project can be found here.

FAB 6: Mounting motors.

The last assignment. And probably my worst. What I wanted to build was a box with a button and a title saying “How will your day at ITP be today?”; when you press the button, the windmill moves to a random answer.

IMG_20181217_212231.jpg

However, with the final-week madness and my P.Comp project being a tangle of wires and a mess of code, I didn’t have the mindspace or energy to make it. So I probably pulled off the worst job of all time: I used hot glue and reclaimed wood, and mounted the motors with basic screws to create something that works.

I did not get time to program the random behavior or do anything else with it. The only thing in this that I was remotely proud of was turning the circular piece of wood. I call this piece “Turning it in“.

FAB 5: Two Materials.

For this assignment, I had to use two materials that work together. I have always wanted to build a lamp that is part silicone, part wood. However, I have been working with silicone a lot lately, so I decided to go with epoxy and try out a new material for a change.

I wanted to play with the relative hardness of the two materials: epoxy starts out fluid while wood is hard, and I wanted to make something which reflects that in its form. After sketching multiple ideas, I homed in on this one.

IMG_20181201_190844.jpg

I wanted to build a lamp with wooden supports and epoxy in the middle which lights up. Once I had a direction in mind, I got started on the wood and the base. Wood from the junk shelf, an angled cut on the miter saw and finishing on the sander gave me two pieces which were perfectly matched in form.

Using plain ol’ geometry to line up the holes.

The next step was to make holes. I made a through-hole in the center for the LEDs to pass through, and two small holes for the epoxy to flow into and harden so that it doesn’t slip. (Remember Mark, you said no screws!) Once I had that, I tested an LED strip and pared it down to size.

IMG_20181207_013243.jpg

The next step was to draw an outline and start forming the mould.

Lining the paper with a plastic sheet to prevent leaks.

Forming the first wall.

IMG_20181207_055645.jpg
Filling up the wooden blocks with plasticine clay to prevent leakage.

Forming the outer layers and using clay to position them.

Clamping it down.

Using hot glue and plasticine to fill up blank spaces and create a water-resistant mold.

Getting the epoxy ready. I wanted to do multiple colors to create a marbled effect.

So, one would expect that it would go smoothly, right? I had the mould ready and the epoxy in a glass, and it was all looking good.

But I made one big mistake: the epoxy I chose doesn’t play well with the plasticine I used. The epoxy also undergoes an exothermic reaction, which made the plasticine stickier and ensured that my structure failed on me. I had taken a small nap after pouring the epoxy and woke up to epoxy seeping all over the floor in the shop, creating a huge mess everywhere. Thankfully there weren’t any people around, as it was 6 in the morning, and I scrambled to clean up before John came in. I finished all the tissues in the kitchen and the loos to mop that mess up, and had to walk back home smelling of epoxy and defeat.

While the end of the process was a complete disaster, it was an education in itself. I hope to repeat this all over again soon and make something that really works and sets.

FAB 4: Enclosure madness.

So, this week our mission was to build an enclosure. (Rubbing hands in glee)

WARNING: Long blog-post ahead.

TL;DR: I built a box for my physical computing project. It was very pretty.

This assignment segued neatly into my physical computing project. For more on the project and its background, please go here.

The first sketch of the box was this:

IMG_20181206_111117.jpg

Initially, I did not have any idea of the dimensions or the scale of the box. So the first stage was to finalize the puzzles and an ergonomic size in line with the theme. I did just that and finalized the puzzles and the tentative layout.

IMG_20181206_110312.jpg

I knew that I wanted the box to be big but still fit within people’s hands. After a quick test with people of different sizes on the floor, I fixed on it being about 3 feet wide. I also wanted it to resemble the control panels of old on the outside. The insides needed more careful consideration: the box had to be sturdy enough to handle people playing with it, so it needed enough cross-bracing. Also, the top panel needed to be swappable so that the box layout could be iterated upon, and to adjust the circular screen in the center. Once the dimensions were fixed, the box was built in two phases.

Phase 1: Acquire the buttons, knobs and dials and figure out their mounting. This part was probably my favorite of the semester. I spent a good part of three days deciding on the buttons, knobs and dials and trawling through the depths of the internet to acquire them.

Once they arrived, I spent a day playing with them while Lillian measured every small detail with the callipers. We brought the measurements over to Illustrator and were ready to do the mounting tests. At this point, tests were done on both acrylic and wood so that there was enough flexibility regarding the choice of materials.

IMG_20181206_064357.jpg
IMG_20181206_112622.jpg
IMG_20181206_113025.jpg

Phase 2: Once the dimensions were figured out, the next task was to build the box. Initially the idea was to use hardwood, but we fell back on ply as it was easier to obtain in the dimensions we needed (and cheaper too). A combination of the miter saw, band saw and sander, and voila!

imagejpeg_0.jpg

Once the box frame was done, we spent a week or two play-testing and getting the layout right. A week before submissions, we started on the final mounts.

IMG_20181211_224045.jpg
IMG_20181212_002209.jpg
IMG_20181213_100734.jpg

All ready for the show! Let’s see how this goes <3

IMG_20181217_152128.jpg

14/14 Posenet rabbithole

As mentioned previously, my ICM final was to design a Pose-Karaoke experience. For my motivations and background on the project, please go to the post here.

While I had major issues getting the ICM code to work on the ml5.js platform, much of that has since been rectified by the maintainers of the code-base, and this current example solves pretty much all of the problems.

But while I was doing this, I did not have those luxuries, which resulted in me understanding how Javascript works and figuring out the issues myself. The main issue was that a p5.Image does not have the img HTML tag that ml5.js needs to be able to run the algorithm. (Funnily, it works for video. No clue why it’s done this way.) This was solved using an image tag. But the problem of taking a screenshot from the live video still remained. I soldiered on and found my redemption in the toDataURL() method.
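A minimal sketch of the kind of workaround this enables, assuming a plain DOM video element (not my exact project code): draw the current frame onto an offscreen canvas, then bake it into an img tag via a data URL.

//Grab the current frame of a <video> into an <img> a model can consume
function frameToImg(video) {
  const snap = document.createElement('canvas');
  snap.width = video.videoWidth;
  snap.height = video.videoHeight;
  snap.getContext('2d').drawImage(video, 0, 0);
  const img = document.createElement('img');
  img.src = snap.toDataURL('image/png'); //snapshot as a data URL
  return img;
}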

But this was the easy part.

While starting the project, I did not realise the complexity of comparing two poses. A lot of what I had to do relied on being able to compare two images, and it wasn’t a trivial problem. Trawling through the depths of the internet, I came across this post by Google Research, where they had worked on a similar problem. The post is a wealth of information on how to compare poses, and incorporating everything was outside my technical ability. But the chief things that I could incorporate were:

1) Cosine similarity: a measure of similarity between two vectors. Basically, it measures the angle between them and returns -1 if they’re exactly opposite and 1 if they’re exactly the same. Importantly, it’s a measure of orientation, not magnitude.

2) L2 normalization: scaling a vector to have a unit norm. This ensures that scale does not play a factor in the comparison, so the two images can be compared on an equal footing.

The cosine similarity helped my code run faster, and the L2 normalization ensured that the relative distance from the camera doesn’t play a role in the comparison.
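A minimal sketch of the two ideas together. This is my own illustration with made-up keypoint arrays, not the project’s actual PoseNet output:

//Poses flattened into [x1, y1, x2, y2, ...] arrays of keypoint coordinates.
//poseB is roughly poseA at half scale, standing in for a closer camera.
const poseA = [120, 80, 130, 150, 90, 210, 170, 215];
const poseB = [60, 40, 66, 74, 44, 106, 86, 108];

//L2 normalization: scale a vector to unit length
function l2Normalize(v) {
  const norm = Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return v.map((x) => x / norm);
}

//For unit vectors, the dot product is exactly the cosine of the angle
function cosineSimilarity(a, b) {
  let dot = 0;
  for (let i = 0; i < a.length; i++) dot += a[i] * b[i];
  return dot;
}

const similarity = cosineSimilarity(l2Normalize(poseA), l2Normalize(poseB));
console.log(similarity); //close to 1: same pose despite the scale difference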

Getting these 2 things to work proved to be a big challenge and once that was done, the comparison went pretty smoothly as seen in the video below:

I ran out of time to build a complete experience for the users, involving an engaging UI, but that gives me something to do over the winter break. While I could not match the scope I had set initially, I am very happy that I could dive into algorithmic complexities and solve those issues to make something that works. This gives me a lot of hope for the future and my coding abilities. All in all, time well spent!

Understanding comics

For animation class, we were asked to read ‘Understanding Comics‘ and reflect on what we learnt. I had read the book many years ago, and it was great to pick it up and go through it all over again. I had originally read it before I started design school, and I realised I had forgotten so many aspects that only emerged the second time around. Here are the reflections I picked up on my second read-through:

  • The narrative is more immediate compared to film. While we demand narrative coherence in film, as we respond to the flow of ‘time’, a comic is free because it can move through time and space in a matter of a few panels. I hypothesize that this is one of the reasons why comic book plots don’t translate well to screen, where the audience responds more to the flow of events across time than to the space of a comic book.

  • The role of the narrator: I believe the narration is the anchor which holds a comic together, which is why I haven’t seen many works where the narration and visuals are at odds with each other. It would be interesting to see a narrative where the visuals and the narration start diverging and running at total odds with each other.

  • Panel-to-panel transitions: this was the biggest part of the book that I had totally forgotten. Scott McCloud does a great job of explaining the various ways in which a narrative can work across time and space using the two-dimensional paper grid. This got me thinking about the forms I see back home: is there an inherent structure to how a story manifests in a mandala or on the wall of an Indian temple? Do similar rules apply? I believe there should be, and hopefully I will find a book that talks about it in detail.

One of the biggest things that struck me while reading this book was that the constraints of the medium squeeze out its narrative style and structure. While the boxes might be seen as constraints by some, artists used them to tell their stories in unique ways that have now become representative of the medium. I wonder if there is a similar story with VR: given that VR does not have any control over the user’s view-point, what are its unique constraints from which VR-only narratives will emerge?

FAB 3: Shattered, does it matter?

This week’s assignment was to make something out of acrylic using the laser cutter.

Easy peasy. I have been meaning to make a Voronoi lamp for the longest time, and it was finally time to scratch that itch. So off to p5 I went. Things are much easier compared to the last time I worked with Voronoi diagrams, and guess what? There are libraries for that now.
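One library that can do this today is d3-delaunay; the post doesn’t record which library I actually used, so treat this as a sketch of the general approach rather than my original code:

const { Delaunay } = require('d3-delaunay');

//Generate 40 random seed points inside a 400x400 box
const points = Array.from({ length: 40 }, () => [
  Math.random() * 400,
  Math.random() * 400,
]);
const voronoi = Delaunay.from(points).voronoi([0, 0, 400, 400]);

//In a p5 sketch, each cell can then be traced as a closed shape:
//for (const cell of voronoi.cellPolygons()) {
//  beginShape();
//  for (const [x, y] of cell) vertex(x, y);
//  endShape(CLOSE);
//}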

After playing around with the shapes and sizes, I brought a few patterns that I liked into Illustrator. I played around with the dimensions of my box and tried to find a cross-section that wouldn’t have very tiny shapes that might trip up the laser cutter.

Screenshot 2018-11-16 14.41.39.png

Once I had that, it was time to find the material. My original plan was to have 6-10 colors, but looking at the costs and the availability of plastics, I brought it down to 4.

Screenshot 2018-11-16 14.46.44.png

I separated the colors into individual files for each color, and off to the laser cutter I went. Cutting the pieces was pretty uneventful, except for the part where I lost a nice slab of acrylic to the 75 W cutter, which refused to work (the cutting gods always demand a sacrifice), and within 45 minutes I had all my pieces. (Easy peasy!)

IMG_20181115_174924.jpg
IMG_20181115_182353.jpg

Sticking the acrylic together was a different monster altogether. The adhesive I had was so runny that it was making my life miserable. Thankfully, Lydia got me out of the soup and loaned me her rubber cement, which made things so much easier. And voila, within two hours I had a lamp!

IMG_20181116_150322.jpg

All I need to do now is find some LEDs to light it up, and it shall be AMAZING!

Finals madness!

For my physical computing final, my initial direction was to continue working with soft robotics. I wanted to explore the material more and create data-driven experiences with softness, slowness and reflection as guiding principles. I brainstormed multiple ideas and none of them felt satisfying. I was tired of being a one-trick pony with silicone, and nothing really felt like it was adding up to a meaningful experience. I spent a lot of time going around in circles till I gave up and focused on everything but physical computing.

And that’s probably the best thing I did.

Not working on P.Comp gave me the time and distance to think about it, and combined with the happy coincidence of my friend Nun continuously going “I wish I could use all the buttons!”, it led me to a happy place that became the start of a promising idea for the final project.

So here’s my 5 minute pitch:

Update: Lillian and Atharva have decided to team up with me! This gives us an actual chance to make a nuanced, complex interactive model, so we have updated our deck with the new, refined idea.

We were born in the 80s and were made in the 90s.

It was a great time to be alive. The music was the best.

The cartoons were definitely the best.

The technology was clunky.

But with the physical controls, it was so satisfying.

It was intimidating to approach.

But when you got it, it became a part of your muscles.

And the sounds really brought them to life.

And we all remember clambering up to rooftop antennas to fix a TV signal.

But the 90s also had something awesome: insanely obtuse point-and-click adventure games!

Where the instructions were minimal and the user had to play around with the interface to discover the path ahead. (In the image above, you have to throw a bone so that the fire beaver on top jumps from the ledge, then quickly pull out your fire extinguisher on it. Once the flame goes out, you can collect a key that opens a gate. Phew!)

So, we want to build a machine where the instructions are obtuse and the users have to play around with the interface to figure out how to make it work.

You walk up to a machine that has a button, knob and dial garden on its face. There’s a single bulb blinking. What will you do?