
CLUTCH-IN

inspiration

In her New York Times opinion article, Crawford writes, “Algorithms learn by being fed certain images, often chosen by engineers, and the system builds a model of the world based on those images” (Crawford). The quote echoes the saying “what you put in is what you get out.” Crawford explains that artificial intelligence systems are the materialization of human input and can therefore reproduce bias and marginalization.

design

The Clutch-in is a kitchen appliance that takes the ingredients you give it and prepares a meal for you without you having to do anything yourself. The machine learns what to cook by watching, through an over-the-stove camera, what the user does when they prepare meals. This user-supplied data lets the system build a repository of possible recipes. Although the machine is good at replicating past recipes, it cannot learn new ones unless a human first cooks them in sight of the over-the-stove camera. If someone wants a dish they have never had before, the user must be the first to cook it so the Clutch-in can replicate it. Users may mess up a dish, but they can also delete that attempt from the learned recipes.
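To make the learning loop concrete, here is a minimal sketch of how such a recipe repository might behave, assuming the camera footage has already been transcribed into an ordered list of steps. The names used here (RecipeRepository, observe, replicate, forget) are hypothetical, not part of any actual Clutch-in software.

```python
class RecipeRepository:
    """Hypothetical store of recipes the Clutch-in has watched the user cook."""

    def __init__(self):
        self.recipes = {}  # dish name -> ordered list of observed steps

    def observe(self, dish, steps):
        # Store the steps the over-the-stove camera saw the user perform.
        self.recipes[dish] = list(steps)

    def replicate(self, dish):
        # The machine can only repeat what it has already seen a human cook.
        if dish not in self.recipes:
            raise KeyError(f"'{dish}' has never been cooked in view of the camera")
        return self.recipes[dish]

    def forget(self, dish):
        # Delete a botched attempt so it is not reproduced.
        self.recipes.pop(dish, None)


repo = RecipeRepository()
repo.observe("tomato soup", ["dice tomatoes", "simmer 20 min", "blend", "season"])
print(repo.replicate("tomato soup"))  # only the observed steps can be repeated
repo.forget("tomato soup")            # a messed-up dish can be removed
```

The sketch reflects the design's central constraint: the appliance has no source of recipes other than what the user has demonstrated, so its output can never exceed the quality of its input.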

critique

This design strictly follows the saying “what you put in is what you get out.” Users have complete control over the quality of the food they are preparing and its flavor. A potential complaint about this technology is that the food will only be as good as the person preparing it: a chef will eat better than someone who does not know how to cook. The technology achieves reproducibility of known information but does not bring in novel information; that is left to the user. For it to be of use, there must still be a reason to prepare food at home, knowledge of how to prepare a dish, and access to groceries.

takeaway

Unlike the Clutch-in, the artificial intelligence systems created for law enforcement and by ad companies like Google give their producers no incentive to achieve an equitable algorithm, because the producers are not directly and negatively affected by it. If developers were subject to the artificial intelligence software they are creating, they would be more hesitant about what they release to society. The Clutch-in kitchen appliance provides a direct incentive to input quality data into the machine, because that data determines what the user receives to eat.

Crawford, Kate. “Artificial Intelligence’s White Guy Problem.” The New York Times, June 25, 2016. https://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html.
