Make Pets Found (using AI) - Day 6
If there was going to be a struggle day, this would be it. Our expedition to Granada had left us well and truly exhausted — probably not helped by half of the group taking a curious detour towards Valencia on the drive back to El Molino Del Conde. Not content with the fifteen rounds of Alhambra they’d had during the day, Matt, Sam and Jesús decided to carry on through and see what an Andalusian sunrise looks like.
Needless to say, there were some sore heads (and limbs) this morning. But what better way to recover than a good bit of exercise? After a couple of solo sessions earlier in the week, Henry ‘The HIIT-maker’ Osadzinski convinced Hugo to join him for a workout, which Matt eagerly caught on camera.
Warning: viewer discretion is advised.
Back to the job at hand: today was about tying together the parts of the application that had, until this point, been worked on separately. With the end-of-week demo in sight, we had to make some decisions about what should be included and what should be dropped so that everything worked smoothly for the final demo.
Over the course of the week, the various parts of the frontend and backend had slowly been coming together, and today was all about making them work in unison.
With James at the helm developing the frontend app, he focused on the matching UI to reunite lost pets with their owners and got the rest of the app connected to the endpoints provided by the rest of the team. Jesús worked on getting his cat colour-categorisation work into the backend of the app, which identifies the dominant colours in an image of a cat so that we can label the cat as a ginger, tabby, mink, etc.
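The post doesn't show Jesús's actual implementation, but the idea of mapping dominant pixel colours to coat labels can be sketched roughly like this (the reference colours and label names here are our own guesses, not the real model's):

```python
from collections import Counter

# Hypothetical coat-colour reference points (RGB) -- illustrative only,
# not the values used in the real classifier.
COAT_COLOURS = {
    "ginger": (200, 120, 50),
    "tabby": (130, 100, 70),
    "black": (30, 30, 30),
    "white": (235, 235, 235),
}

def nearest_coat(rgb):
    """Return the coat label whose reference colour is closest
    to the given pixel (squared Euclidean distance)."""
    def dist(name):
        ref = COAT_COLOURS[name]
        return sum((c1 - c2) ** 2 for c1, c2 in zip(ref, rgb))
    return min(COAT_COLOURS, key=dist)

def classify_cat(pixels):
    """Label every pixel with its nearest coat colour and
    return the most common label as the cat's category."""
    labels = Counter(nearest_coat(p) for p in pixels)
    return labels.most_common(1)[0][0]
```

In practice the pixels would come from a (downscaled) photo of the cat, and the background would need masking out first; this sketch only shows the labelling step.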
Sam focused on getting our dog breed detector model to integrate with the app, so that results from Google Cloud Vision can be enhanced by adding our own inferences from images submitted to the app.
George focused on improving our matching function, so that when users are looking for their lost pet, the search parameters (similarity, location, etc.) automatically get broader if their pet isn't found straight away.
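The post doesn't include George's code, but the broadening behaviour can be sketched as a retry loop that relaxes its parameters each time no match comes back (the thresholds, step sizes, and names below are illustrative assumptions):

```python
from dataclasses import dataclass

# Hypothetical starting parameters -- the real matcher's values
# aren't given in the post.
@dataclass
class SearchParams:
    min_similarity: float = 0.9   # required match score
    radius_km: float = 5.0        # search radius around the last-seen location

def broaden(params, step=0.1, radius_factor=2.0):
    """Relax the search: accept a lower similarity score and
    look further afield, with a floor on similarity."""
    return SearchParams(
        min_similarity=max(0.5, params.min_similarity - step),
        radius_km=params.radius_km * radius_factor,
    )

def find_pet(search, params, max_attempts=4):
    """Run the search, broadening the parameters until a match
    appears or we run out of attempts."""
    for _ in range(max_attempts):
        matches = search(params)
        if matches:
            return matches
        params = broaden(params)
    return []
```

Here `search` stands in for whatever query the backend actually runs against its database of found pets.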
Meanwhile, Luke worked on improving categorisation for dogs in the system, and implemented a queue system using AWS SQS so that processor-intensive tasks get prioritised and run correctly.
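Since SQS has no built-in message priorities, one common pattern is to run two queues and have workers drain the high-priority one first. We don't know exactly how Luke structured it, but as a local stand-in (queue names and task labels are made up):

```python
from collections import deque

# Local stand-in for a two-queue SQS setup: a high-priority queue for
# heavy image-processing jobs and a default queue for everything else.
queues = {"pets-heavy": deque(), "pets-default": deque()}

def enqueue(task, heavy=False):
    """Route a task to the heavy queue if it's processor-intensive."""
    queues["pets-heavy" if heavy else "pets-default"].append(task)

def next_task():
    """Workers poll the heavy queue before touching the default one,
    so expensive jobs always jump the line."""
    for name in ("pets-heavy", "pets-default"):
        if queues[name]:
            return queues[name].popleft()
    return None
```

With real SQS, `enqueue` would be a `send_message` call to the appropriate queue URL and `next_task` a pair of `receive_message` polls in priority order.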
Luke, Simon and Laura also started planning how we market the application.
Following a group showing of Henry and Hugo's workout video — there were many, many laughs — James took control of the screen to present a final demo of the app. It was exciting to see the machine learning model in action, and a week's worth of hard work come to life.
After cooking the 'last supper', Simon excitedly revealed a festive dessert he'd picked up at the supermarket. Roscón de Reyes is a Spanish pastry traditionally eaten to celebrate the Epiphany, and, as Jesús explained, whoever finds the toy king hidden inside wears the crown provided, and whoever finds the broad bean hidden inside (don't ask) has to pay for the cake. Matt let out a celebratory yelp as he found the little king buried under a layer of whipped cream, and was duly crowned King Mateo el Lead Designer. Unfortunately for George, he was the recipient of the broad bean, and almost agreed to pay for… the entire hackathon.
We rounded off the evening with a few episodes of studio favourite Arrested Development courtesy of Luke’s generous data allowance.