Neural networks for dialing in espresso


#1: Post by CasualRei_ »

I was recently inspired by a video by Tate Mazzer, a barista/software engineer. For a university project, he used computer vision and neural networks to assess the quality of a shot based on footage of the basket as the shot is being pulled.

This concept could easily be applied using flow and/or pressure as a metric instead and would also be accessible to anyone with a Bluetooth scale.

I'd essentially like to put the feelers out. If any members of this forum would be interested in sharing their profiles from the Smart Espresso Profiler app for me to use as a training data set, I'd love to see if, at the very least, I can recommend finer or coarser grind settings based on a shot's performance. There would definitely be uses beyond that... I'm looking at you, Decent and Meticulous users.
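For anyone curious what "training data from a Bluetooth scale" might look like in practice, here is a minimal sketch of the preprocessing step such a project would need: shots are logged at irregular intervals, so each one has to be resampled onto a fixed-length grid before it can be fed to a neural network. The function name, grid size, and the choice of appending a flow-rate channel are all my assumptions, not anything from Tate's project.

```python
import numpy as np

def shot_to_features(times, weights, n_points=50):
    """Resample one shot's irregularly sampled (time, cumulative weight)
    log onto a fixed-length grid so every shot yields the same input
    size for a model. Returns the interpolated weight curve followed by
    its flow rate (g/s), i.e. 2 * n_points features.

    n_points=50 is an arbitrary illustrative choice."""
    times = np.asarray(times, dtype=float)
    weights = np.asarray(weights, dtype=float)
    grid = np.linspace(times[0], times[-1], n_points)
    curve = np.interp(grid, times, weights)   # weight at each grid time
    flow = np.gradient(curve, grid)           # instantaneous flow rate
    return np.concatenate([curve, flow])

# A hypothetical 30 s shot: 18 g in, 36 g out, logged at 5 points.
features = shot_to_features([0, 5, 10, 20, 30], [0, 1, 8, 25, 36])
```

With every shot reduced to the same feature length, the curves can be stacked into a matrix and handed to any off-the-shelf classifier.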

This might be a divisive topic, I'm sure. Personally, I'm very happy with my current dialing-in process, and a project like this would be purely out of curiosity.

Lastly, if anyone has not seen Tate's channel, I highly recommend it. The content he uploads is unlike that of any other creator in the coffee space, and I'm sure the modders of this forum would appreciate it.

Tate's socials:


#2: Post by BodieZoffa »

Clearly some people need guidance trying to improve what they currently do, but the tech world isn't the be-all/end-all approach in my opinion. Some will constantly try to reinvent the wheel and end up chasing their tail in that vicious pursuit, which usually ends in utter failure.


#3: Post by espressoren »

Interesting: there are a lot of people on these forums looking for advice on things like shots flowing too quickly or sour results. It would indeed be nice to have something people could use to auto-diagnose and make suggestions.

I think it's a lot easier to do for something like footage of the basket: it is easy to see if the bottomless basket is extracting evenly, or spitting, or whatever, and make an assessment. It is harder for something like flow or pressure, where people are actively encouraged these days to try different rates, pressures, and profiles; there isn't necessarily one correct way. One could possibly have a target profile and assess how close the user got to it, but that doesn't necessarily require AI.
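The "compare against a target profile" idea really doesn't need AI; a single distance metric already does it. Below is a minimal sketch, assuming both profiles have been sampled on the same time grid (the function name and the RMS choice are mine for illustration):

```python
import math

def profile_deviation(target, actual):
    """RMS deviation (in bar) between a target pressure profile and the
    measured one, both sampled on the same time grid. 0.0 means the
    user hit the target exactly; larger values mean a bigger miss."""
    diffs = [(a - t) ** 2 for t, a in zip(target, actual)]
    return math.sqrt(sum(diffs) / len(diffs))

# Hypothetical flat 9-bar target vs. a shot that wandered a bit.
score = profile_deviation([9.0, 9.0, 9.0, 9.0], [8.5, 9.2, 9.4, 8.9])
```

A tool could simply report this number per shot, or flag the time region contributing most to it, with no model training involved.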


#4: Post by CasualRei_ (original poster) »

I would agree with you. The method Tate used for his project is probably much better at spotting faults in puck prep, which is a huge factor in how well a shot will perform.

There are a couple of reasons why I'd be hesitant to go down that path right out of the gate.
The first is that it is Tate's unique and original idea; he has clearly put in a fair amount of work, and whether or not he intends to do anything with it beyond the paper, copying that work and making it available to the public (which I very much intend to do with this project) is not something I'd personally support.
Another reason is that, for anyone who would find this useful, adding a camera to their espresso setup is another barrier to entry. If someone already has access to pressure and flow data, or ideally even just a scale, they could participate right away.

I do think there is a fair amount of value to be had from dose in, dose out, pressure over time, and flow over time. Even if the first result is just a Good Shot/Bad Shot classification, there is room to expand from there.
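Even before any model exists, those same inputs (dose in, dose out, shot time) can bootstrap a rough Good Shot/Bad Shot label to train against. Here is a crude rule-of-thumb sketch; the thresholds (2:1 ratio, 25-35 s window) are common folklore numbers I'm assuming for illustration, not anything proposed in this thread:

```python
def label_shot(dose_in, dose_out, shot_time,
               target_ratio=2.0, time_window=(25.0, 35.0)):
    """Crude rule-based labeller for bootstrapping a training set.
    Returns "good" or a "bad: ..." string with a suggested fix.
    All thresholds are illustrative assumptions."""
    ratio = dose_out / dose_in
    if abs(ratio - target_ratio) > 0.25:
        return "bad: ratio %.2f off target" % ratio
    if shot_time < time_window[0]:
        return "bad: too fast - try grinding finer"
    if shot_time > time_window[1]:
        return "bad: too slow - try grinding coarser"
    return "good"

# Hypothetical shots: 18 g in, 36 g out, at different shot times.
print(label_shot(18, 36, 30))   # lands inside the window
print(label_shot(18, 36, 20))   # ran too fast
```

Labels like these are obviously no substitute for tasting, but they give a classifier something to learn from until real, taste-based labels are collected.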

At the end of the day, this project is less about brewing the perfect shot of espresso and more about curiosity and figuring out how new passions (computer science) can augment old passions (delicious coffee) and also sharing that process with others.

Team HB

#5: Post by Jeff »

Probably the largest source of data is Miha's

To me, this is a "cook vs. chef" problem, once you're out of the obvious problems†. It is a cook's task to execute on a recipe, that of the chef to define it and adjust it as ingredients change. There is plenty on the Internet and other lore that a perfect espresso shot is "18 g in, 36 g out, in 25 seconds" (pick your numbers). This is a cook's task. If you can't get there within two or three shots, you've got problems with your ingredients, equipment, or skill. However, to get very good or better espresso, you need a chef -- that involves the ability to taste and evaluate.

The people who would most benefit from something like this often haven't developed the sensory skills needed or the knowledge of what the end-point might be. Without good inputs (the sensory data), no matter how good your model is, the outputs will not be very useful.

† The obvious problems include:

* Gear that is unsuitable for preparing reasonable-quality espresso, either in capabilities (such as pressurized baskets, "15 bar" machines, crap grinders) or shot-to-shot variability.

* Unsuitable ingredients, especially "supermarket" coffee, even whole bean

* Inappropriate expectations, for example expecting cheap, commodity-grade coffee that was roasted into second crack to not be bitter

* Basic skill problems, typically poor prep (either under- or over-prepped)

* Failure to follow common directions, with "grind finer" being near the top of the list


#6: Post by coyote-1 »

Now all we need do is pair the Decent/Meticulous with a grinder that can decipher via AI the exact grind that will result in the best possible extraction. It will of course need a spectrum analyzer to glean the composition of the roast you're using, in order to make the correct adjustments. Add automation to the Moonraker so that it utilizes the data gleaned during the previous process to optimize the distribution in the puck. Then add infrared guides to the portafilter, so it is always perfectly inserted into the machine. Perhaps IBM's Watson, with its stunning analysis capabilities, can be brought in to assist.

In this way, we may finally get a cup of espresso that we can tolerate.

Team HB

#7: Post by JRising »

coyote-1 wrote: Add automation to the Moonraker so that it utilizes the data gleaned during the previous process to optimize the distribution in the puck
You're on to something...
You could add a stepper motor (we could debate which motor and which driver is best for at least 3 pages) to the Moonraker and sell them for $950 apiece to the first suckers customers/testers, who could then debate the best rotational speed for a month or two (we can't trust that AI will be correct; it will be making most of it up) until Hoffmann tells us all his opinion and Dan has to cool down the topic for 24 hours because I'll just have to give my opinion. :roll: