This workshop explores how designers can use AI, Python, and Grasshopper to build adaptive tools to control geometry across digital and physical workflows.
Every language has the same purpose: communication between two entities. With the proliferation of AI tools, it has suddenly become much easier to communicate your needs to a computer in plain language, rather than machine code. In this workshop, we will discuss how to use AI tools to create custom applications, web pages, and one-off Grasshopper tools.
We will start with how to describe a piece of software you want to build to an AI coding tool, how to start integrating real data, and how to export that data for use elsewhere. Then, we will cover how to prepare one piece of software to work with another, using conventions such as shared data formats that make communication between tools smooth.
Next, we will introduce Python and how to use it in Grasshopper to experiment with 3D-printed textures. Finally, we will tie it all together by building a custom computer vision algorithm that produces data ready for Grasshopper to use.
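To give a flavor of the Python-in-Grasshopper step, here is a minimal sketch of a texture generator. It is written in plain Python so it runs anywhere; inside a GhPython component the coordinate tuples would typically be replaced with `Rhino.Geometry.Point3d` objects. The function name and parameters are illustrative, not part of any Grasshopper API.

```python
import math

def texture_points(width, height, spacing, amplitude, frequency):
    """Generate a grid of points displaced by a sine-based texture.

    The z value is the displacement a 3D printer would trace;
    plain tuples keep this sketch runnable outside Rhino.
    """
    points = []
    for i in range(width):
        for j in range(height):
            x = i * spacing
            y = j * spacing
            # Sine ripple: crossing sine and cosine waves yields a bump grid
            z = amplitude * math.sin(frequency * x) * math.cos(frequency * y)
            points.append((x, y, z))
    return points

# A small 4x4 grid of textured points
pts = texture_points(4, 4, spacing=1.0, amplitude=0.5, frequency=math.pi / 2)
print(len(pts))  # 16
```

In Grasshopper, the same list of points could feed directly into a surface-from-points or contouring component to become printable geometry.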
Throughout, we will navigate the fast-moving world of AI coding to identify the most resilient workflows for different types of problems. The goal is to understand how to define the problems you want to solve and choose the optimal tools to solve them.
We will use real-time data where available to inform design decisions: instead of pulling a static set of statistics once, you can continuously review the exact conditions of a place. This push into real-time data allows architects to focus on the ongoing condition of a building after delivery while still designing for day-one occupancy.
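As a sketch of how a live reading might reach Grasshopper, the snippet below flattens a JSON payload into parallel key/value lists, a shape Grasshopper consumes easily. The field names in the sample payload are assumptions for illustration, not any specific API's schema; a real workflow would fetch the payload from an endpoint on a timer.

```python
import json

# Hypothetical payload from a real-time environmental-data endpoint;
# these field names are placeholders, not a real API schema.
SAMPLE = '{"temperature_c": 21.4, "humidity_pct": 55, "wind_mps": 3.2}'

def to_gh_lists(payload):
    """Flatten a JSON reading into parallel key and value lists,
    sorted by key so the two lists always stay aligned."""
    data = json.loads(payload)
    keys = sorted(data)
    values = [data[k] for k in keys]
    return keys, values

keys, values = to_gh_lists(SAMPLE)
print(keys)  # ['humidity_pct', 'temperature_c', 'wind_mps']
```

Re-running this function on each new payload is what turns a one-off statistic into a continuously updating design input.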
With that foundation in accessing real-time data, we will move into the ways that you can construct unique data and output combinations to fit your own needs. This will include methodologies for Computer Vision, and how to creatively interpret this data into a pattern generator for a 3D printed object in Grasshopper.
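One possible interpretation step, sketched without any imaging library: map each pixel's brightness to an extrusion depth for a 3D-printed texture. The toy grid below stands in for a camera frame that, in practice, would come from a library such as OpenCV; the mapping function and its parameters are illustrative assumptions.

```python
# A toy 4x4 "grayscale image" (values 0-255); in a real workflow this
# grid would come from a camera frame via a library such as OpenCV.
image = [
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 64, 128, 192, 255],
    [  0,   0, 128, 128],
]

def brightness_to_depths(img, max_depth=2.0):
    """Map brightness to extrusion depth for Grasshopper:
    dark pixels print deep texture, bright pixels stay flat."""
    return [
        [round((1 - px / 255) * max_depth, 3) for px in row]
        for row in img
    ]

depths = brightness_to_depths(image)
print(depths[0])  # [2.0, 1.498, 0.996, 0.0]
```

The resulting grid of depths can drive a point grid or heightfield in Grasshopper, making the printed pattern specific to whatever the camera saw.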
The goal is to reflect on the open-endedness of the possible combinations of data source and data interpretation. By doing this, we can create projects that are even more dynamically specific to their location and to the data that is generated and constructed within them.
Day 1:
Day 2: