
Wednesday, February 22, 2012

Polecat Porter Tasting

Standard on left, eised version on right.
It's rare that I brew the exact same beer twice: typically, if I'm doing a re-brew, I'll tweak something, either in an attempt to bring the beer more in line with my tastes or to try out a new process or flavor ingredient.  My original Baltic porter exceeded my expectations, though, and instead of trying to create an even better beer, I wanted to replicate the original so that I could proudly serve one of my favorite beers I've made at NHC.  Unfortunately, I don't think that will be the case this year…or at least not in its intended form.

Although the standard version turned out to be a great-tasting beer in its own right, it's certainly not my original Polecat Porter and probably can't even be classified as a Baltic porter.  For whatever reason, the beer lacks the malt depth and complexity that the original possessed, the color is slightly lighter, and the body is much thinner than it should be.

I try to keep thorough notes when I brew, and after looking over both brew sessions, it appears as though my processes were identical.  It's possible that I accidentally left out or incorrectly measured a malt, but I'm starting to think that maybe the difference lies in the brand of malts that I used.  I never recorded this in the past, and, as an example, with chocolate malt color ratings ranging from 350 to 600 depending on the maltster, not using the exact same grain as before really can alter the outcome of the beer.  I'm not sure if this is what happened with my beer or not, but going forward, you can bet that I'll be recording all of this information so that I can eliminate this variable should the problem arise again.
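To put some numbers on that, here's a minimal sketch of how much the maltster's color rating alone can move the estimated beer color, using the Morey SRM formula. The grain bill and batch size below are hypothetical, not my actual Polecat recipe:

```python
# Estimate beer color with the Morey equation: SRM = 1.4922 * MCU^0.6859,
# where MCU = sum(grain color in Lovibond * pounds) / gallons.
# Hypothetical grain bill -- not the actual Polecat Porter recipe.

def estimate_srm(grain_bill, batch_gal):
    """grain_bill: list of (weight_lbs, color_lovibond) tuples."""
    mcu = sum(lbs * lov for lbs, lov in grain_bill) / batch_gal
    return 1.4922 * mcu ** 0.6859

base_bill = [(12.0, 2), (1.0, 120)]  # pale malt + Special B (assumed weights)
for choc_lov in (350, 600):          # chocolate malt color, by maltster
    srm = estimate_srm(base_bill + [(0.75, choc_lov)], batch_gal=5.5)
    print(f"chocolate malt @ {choc_lov}L -> ~{srm:.0f} SRM")
```

Under those assumed numbers, the swing is roughly 28 versus 37 SRM, which is easily enough to explain a visibly lighter beer.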

The good news is that, while technically still not a Baltic porter, the eised version turned out to be fantastic.  Since my standard version is still lagering uncarbonated in the fridge, I plan to eis the entire keg.  It may not be the original beer that I intended to serve at NHC, but it's a unique option and one that I think a lot of people will enjoy.
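For anyone curious about the math on eising, the rough first-order estimate assumes only water freezes out, so the ABV scales with the volume ratio. A minimal sketch with made-up numbers (not measurements from this batch):

```python
# Back-of-the-envelope eisbock estimate: if only water is removed as ice,
# the remaining beer's ABV scales by the inverse of the volume kept.
# In practice some beer is trapped in the ice, so this is an upper bound.

def eised_abv(start_abv, fraction_removed):
    return start_abv / (1.0 - fraction_removed)

# e.g. an 8% beer with 20% of its volume pulled off as ice -> ~10% ABV
print(f"~{eised_abv(8.0, 0.20):.1f}% ABV")
```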

Appearance:  
Standard – Very clear, dark brown with a sort of ruby highlight when held up to the light.  Sandy head that fades quickly.
Eised – Very clear, more of a dark chestnut color, almost tobacco, and less of the ruby highlights.  Very difficult to get any head.

Taste:
Standard – Very clean and smooth with a mild chocolate flavor.  Quite a bit of soft fruit flavor from the Special B, but not in a sweet way.  Almost no roast, but there is considerable breadiness and slight toffee notes.  Clean bitterness without any hop flavor.
Eised – Compared to the standard, there's less chocolate and more toffee/caramel.  Slightly sweeter too, but not overly so or in a cloying manner.  There's also some warming alcohol, but no heat whatsoever.  Very rich flavors, but they're extremely smooth and have melded together quite well.

Mouthfeel/Drinkability:
Standard – Too thin and dry to be categorized as a Baltic porter.  Sample was carbonated too quickly and needed more time to really integrate well.  No astringency.
Eised – Medium-full bodied with moderate carbonation that kept the beer from being too heavy on the tongue.  No astringency.

Overall:
Standard – If I didn't tell you that it was supposed to be a Baltic porter, I think it would be a highly enjoyable beer…sort of a nice, malty lager with a background chocolate flavor.  To me, though, it's completely flawed because it's not what I wanted it to be, and it's hard to get past that.
Eised – This beer is great…loads of layered flavors with rich complexity.  It's totally different from the base beer and, as expected, from my original intentions, but in its own right it's unique and highly enjoyable.

Thursday, November 17, 2011

Autumn Maple Clone(s) Tasting

From left to right, '09 Clone, '09 Bruery's Autumn Maple, and '10 Clone
Although I brewed my first attempt at an Autumn Maple clone over 2 years ago and my 2nd attempt about 1 year ago, in all that time I never once tasted either side by side with the real thing.  I also never compared my two versions to one another at the same time.  It's crazy to think that I allowed that much time to pass without conducting the tasting, but after realizing this, I decided it was time to put them to the test.

Last week I hosted our monthly homebrew club meeting, and it ended up being the perfect opportunity to present the three-way comparison.  Since only a few of the members had previously tasted my versions and not everyone had had the chance to try the Bruery's Autumn Maple, I thought it would be fun to make it a blind tasting.  Because my first batch was nearly 2 years old, I was afraid to open a fresh commercial bottle of Autumn Maple for fear that the alcohol and spicing intensities might be different.  Luckily enough, I happened to have a bottle from '09 squirreled away in my cellar that was perfect for the occasion.

Appearance:  Even though I poured the glasses and knew which beer was in which glass, it was easy to see which one was not like the others.  The Bruery's beer showed significantly more carbonation, which resulted in a small layer of foam that lingered around the edge of the glass long after the others had faded.  The two versions that I brewed were crystal clear with a slightly more orange hue than the commercial beer.  What really surprised me, though, was that my two clones looked practically identical even though I used nearly twice the amount of yams in my '10 clone.  I'm not sure whether it was due to their bottle conditioning, but the commercial version was slightly cloudy (mine were force-carbed and bottled with a counter-pressure filler).

Aroma:  The differences here were very subtle.  I'd say my original clone attempt ('09) smelled nearly identical to the Bruery's '09, while my '10 clone had perhaps a slightly milder spice profile.

Taste:  I was extremely surprised at how similar the flavors were between my '09 clone and the Bruery's '09 Autumn Maple.  Comparing the two, the spices in the Bruery's version might have been a little more rounded, with maybe a touch more molasses-cookie flavor, but if so, the differences were extremely subtle.  With the '10 clone, the spices seemed even a bit more subdued and, more importantly, it had this very nice warm caramelliness/toffeeness to it.  Maybe it's due to the lower alcohol than in my first clone, or maybe it's from the addition of yams in the boil, but regardless, the flavors in this beer seemed to blend together in a smoother manner than all the others, and for me it was the most enjoyable to drink.

Mouthfeel:  Although the Bruery's beer showed more signs of carbonation, I'm not sure that I really felt a difference in my mouth.  One member of our group mentioned that they thought the Bruery's version had a slightly fuller mouthfeel, maybe from less attenuation.

From left to right, '10 Clone, '09 Bruery Autumn Maple, and '09 Clone
Overall:  Overall I was shocked at how close my two beers came to the original.  When you have them all in front of you and you're trying to determine ways in which they differ, yes, you can find minor variations.  However, between the '09 clone and the '09 commercial version, those differences were so subtle that I would be willing to bet that if I handed someone familiar with Autumn Maple my own '09 clone and told them that it was the Bruery's beer, they'd never suspect it was anything but the original (aside from the clarity).  The same thing might happen with the '10 clone, but because it had that nice rounder spice and toffee note, I wouldn't feel as confident.

Since the above notes are more of a comparison between the beers and not actual descriptions of their components, you can see what some BJCP judges wrote about them by clicking here:  ’09 Autumn Maple Clone and ’10 Autumn Maple Clone

Friday, November 11, 2011

Summertime Gose Tasting

A few months back, I came across a post on Beer Advocate in which the OP was searching for various Goses for a tasting that he was going to be hosting.  Not only was he interested in commercial versions, but he also mentioned that he'd love to include some homebrewed varieties.  Since I've really only tried one other example of a Gose, I've never been sure how my Summertime Gose compares to the standard, and this seemed like a perfect opportunity to find out.  So, in exchange for an honest critique, I sent off a bottle to UrbanCaver, and the results came back a week or two later.  You can read the full post here (http://beeradvocate.com/forum/read/4136632#4150881) or just browse his review below.

Review of SonicAlligator's homebrew:

Appearance:  Pours a light pale yellow. Lighter than all of the other Goses except the Portsmouth, which is similar. Lots of nice white head. Well carbonated. Pretty clear.
Smell:  Smells the best of all the Goses sampled. Lots of coriander adds nice lemony characteristics on the nose. A distinct smokiness on the nose. Much more complex than the other Goses but far less smoky than the CCB Gose.
Taste:  Tastes lemony and tart. Can definitely taste the lacto (maybe a bit too much). Other than the CCB Gose, this is by far the smokiest of the bunch. Coriander spice is strong in this one. Gives lots of lemon zest to the brew. Combined with the lacto tartness, it's a bit much in terms of lemony fruitiness. Salt is detectable but doesn't detract from the brew.
Mouthfeel and Drinkability:  Well carbonated mouthfeel. Thick and a bit more "full" of flavour than I would like for this style. Part of what I enjoyed about the Portsmouth and Leipziger Goses is that they were light and very refreshing. While this one is a more flavourful, complex beer, it is not as refreshing as the others.

My own notes:  Overall, I'd say I'm fairly happy with the way this one came out.  For my own personal preference, I think the balance of acidity and saltiness is spot on, and if I were to brew it again, I might cut back slightly on the coriander.  At the moment it's present but not overpowering, but I think I'd like it to be a little more subtle.  I also agree on the "fullness" that UrbanCaver reported.  While the beer doesn't sit heavy by any means, I'd prefer it to feel a little lighter so that it would be more refreshing (maybe cut back on the salt?).  Additionally, and this is kind of odd, even after going back and searching for the flavor, I was never really able to pull out the smokiness that he reported.  It probably came from the few cardamom seeds that I threw in, but the flavor comes across to me as maybe a bit more minty than smoky.

Tuesday, September 20, 2011

Brettanomyces Experiment Results

Back in April, using Chad Yakobson's Brettanomyces Project dissertation as a backdrop, I began an experiment to determine which starting concentration of lactic acid would yield a brett beer with the most desirable ester profile.  Yakobson's empirical data showed that, basically, as the initial level of lactic acid increased, the level of attenuation increased while the secondary metabolites decreased.  While this information is extremely fascinating to me, it didn't tell me where the perfect middle ground would be, or if there even is one from a completely subjective taste perspective.  To find out, I replicated the study using a smaller sample size, and this last week I finally was able to look at the results.

After fermenting and aging for about four months, I decided to open up each Brettanomyces lambicus sample and take the fermentative measurements.  Oddly enough, all six of the White Labs samples finished out right around 1.0043 specific gravity, while all of the Wyeast samples finished out at 1.000.  Granted, each of my samples fermented for about 3.5 times as long as those in Yakobson's study, but I was still a little surprised that the initial concentration of lactic acid didn't play a bigger role in the final gravity of each beer.  In his Wyeast samples, there was nearly double the apparent attenuation between the 0mg/L sample and the 3000mg/L sample, and yet in my results they ended up nearly identical.
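For reference, apparent attenuation is just the share of the original gravity points that fermented out. Here's a minimal sketch of the calculation; the 1.048 OG is an assumed number for illustration, since I haven't restated the starting gravity of these samples here:

```python
# Apparent attenuation = (OG - FG) / (OG - 1) * 100.
# The OG below is assumed for illustration only.

def apparent_attenuation(og, fg):
    return (og - fg) / (og - 1.0) * 100.0

og = 1.048  # assumed original gravity
for strain, fg in (("White Labs", 1.0043), ("Wyeast", 1.000)):
    print(f"{strain}: {apparent_attenuation(og, fg):.0f}% apparent attenuation")
```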

Although the disparities between the two are not strikingly large, the Wyeast beers did end up with slightly lower pHs than the matching White Labs beers.  I'm not sure if this indicates that the Wyeast strain produces slightly more acid than the White Labs version or if it has to do with the slightly different levels of attenuation.  If the White Labs beers had fermented all the way down to 1.000, would the pHs be more similar?

As fun as the empirical data is to look at, my main reason for this experiment was that I wanted a subjective view of the flavor.  So, on September 15th, six of my homebrew club friends, each with a sophisticated palate, gathered for a blind tasting.  Although I knew that a few of the participants had read Yakobson's dissertation, I wanted to keep the details of my experiment a surprise, so prior to the tasting only one other person in the group was aware of what we would be sampling.  I also wanted to be able to participate blind, so at bottling time I randomly selected samples from the batch and transferred them into consecutively numbered bottles (I did keep index notes though).  Four weeks later at the tasting, when I poured each bottle into the individual sample glasses, I had long since forgotten which bottle was which beer.
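If you want to run the same kind of blinding, the bottling-day scheme boils down to a random shuffle plus a sealed key. A minimal sketch, assuming the twelve strain/acid variants from this experiment:

```python
# Blind the twelve variants: shuffle them into consecutively numbered
# bottles and keep the mapping as sealed index notes until after scoring.
import random

variants = [f"{strain} {acid}mg/L"
            for strain in ("White Labs", "Wyeast")
            for acid in (0, 100, 500, 1000, 2000, 3000)]
random.shuffle(variants)

# The index notes: bottle number -> variant. Don't peek until the tasting.
key = dict(enumerate(variants, start=1))
for bottle, variant in sorted(key.items()):
    print(f"Bottle {bottle}: {variant}")
```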

Each participant was given one sample from each of the 12 variants at the same time, along with a score sheet to record their perceptions and preferences.  Since Yakobson's study showed dramatic changes in four ester concentrations based upon the initial levels of lactic acid, I included those four flavor profiles (Ethyl Acetate – solventy, Ethyl Lactate – buttery/creamy, Ethyl Caproate – fruity/wine-like, and Ethyl Caprylate – fruity/apple) as well as two other characteristics (goaty/funky and overall sourness).  In addition to these flavors, I also asked the participants to rate each sample on what they determined to be its overall "Brettiness", as well as their overall satisfaction with it.  All ratings were on a 5-point scale with 5 being the highest.

For each of the samples below, I compiled the scores, threw out the high and low, and then averaged the scores for each concentration level.  The results are as follows: 
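In other words, each number below is a trimmed mean. A quick sketch of the calculation with placeholder ratings (not actual score-sheet data):

```python
# Trimmed mean as described above: drop the single highest and lowest
# rating for a sample, then average the rest.

def trimmed_mean(scores):
    trimmed = sorted(scores)[1:-1]  # discard one low and one high
    return sum(trimmed) / len(trimmed)

ratings = [3, 4, 2, 5, 3, 4]  # six tasters, 5-point scale (placeholder data)
print(f"{trimmed_mean(ratings):.2f}")  # the 2 and the 5 are thrown out -> 3.50
```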


[Chart: mean ethyl lactate scores by initial lactic acid concentration]
Mean ethyl lactate scores for Wyeast seemed to increase as the initial concentration of lactic acid increased, but after 1000mg/L, the perception of the ester dropped off.  A similar thing happened with the White Labs strain, except there was a near-equal level of detection between the 1000 and 3000mg/L concentrations.  This is interesting to me because in Yakobson's study there was a clear correlation between the acid and the ester, with the concentration of the ester seeming to increase at an exponential rate as the acid increased.  In my results we didn't really see that; in fact, the overall highest perception occurred at 1000mg/L.

[Chart: mean ethyl acetate scores by initial lactic acid concentration]
The mean ethyl acetate scores in my results differ considerably from what Yakobson found in his study.  Similar to ethyl lactate, starting from an initial concentration of 100mg/L of lactic acid, there was a slight correlation between the ester and the acid concentration.  We see a somewhat similar pattern with the White Labs strain starting at about the 500mg/L concentration, but with Wyeast, from that same mark the two are basically inversely correlated.

[Chart: mean ethyl caproate scores by initial lactic acid concentration]
Interestingly enough, my results for ethyl caproate are almost entirely the opposite of what Yakobson found.  In general, this is one of the esters that he found to decrease in concentration as the amount of initial lactic acid increased.  My results show that our perception of the ester generally increased along with the lactic acid concentration.  It's interesting to note that Wyeast again peaked in flavor around the 1000mg/L mark, whereas with the White Labs strain, starting at the 100mg/L mark, there's a fairly consistent increase in ester detection as the lactic acid increases.

[Chart: mean ethyl caprylate scores by initial lactic acid concentration]
Ethyl caprylate is somewhat similar to ethyl caproate, but our detection of it varied by strain.  With the Wyeast strain, detection appeared to grow up until the 500mg/L point, after which the level fell back closer to the 0mg/L mark.  With the White Labs strain, though, our perception of it was all over the map, and there does not appear to have been any correlation with the lactic acid concentration levels.  Both of these results were contrary to what Yakobson found.

[Chart: mean perceived "brettiness" scores by initial lactic acid concentration]
In addition to looking at the specific, individual esters, I asked my participants to taste each sample and rate their contentment with the overall "brett" flavor.  This is a difficult concept to quantify, since each participant's perception of what brett really tastes like can differ and their individual preferences for certain brett-associated esters vary, but overall, I wanted to see if there were any noticeable trends.  It turns out that with the White Labs strain, there definitely appears to be an increase in perceived "brettiness" up until the 1000mg/L mark, after which the perception essentially levels off.  Although there was a big drop between the 0 and 100mg/L acid concentrations, starting at the 100mg/L mark we see a similar pattern with the Wyeast strain: perceived "brettiness" increases with the lactic acid concentration up to about the 2000mg/L mark.

So how does all of this add up?  Assuming that these results are repeatable, I suppose if you wanted to tailor a beer to have a certain flavor profile, you could look above for the ester that you want to feature and then choose the initial concentration of lactic acid that resulted in the highest perception of that ester.  But if you want a mixture of esters resulting in the best overall flavor profile, what's the best method?  With the data that I gathered, I went about this in two different ways.

[Chart: overall mean flavor scores by initial lactic acid concentration]
In the first method, for each level of initial concentration, I took the six flavor profile scores and created one overall mean score.   As you can see in the Wyeast results above, it appears as though people thought that the greatest concentration of flavors was to be had at the 500-1000mg/L initial acid concentration level.  With the White Labs strain, the results were a little more mixed.  In general, flavor concentrations were greatest at, or above, 1000mg/L with the highest overall flavor rating at the 3000mg/L mark.  

In addition to aggregating the individual flavor ratings, I also asked each participant to rank their top three favorite samples based on overall flavor and general satisfaction.  To compute a final score and rank, I awarded 3 points for each 1st-place vote, 2 points for each 2nd-place vote, and 1 point for each 3rd-place vote.
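This is essentially a small Borda-style count. A minimal sketch of the tally; the ballots here are invented for illustration, not the actual votes:

```python
# 3-2-1 tally: 3 points per 1st-place vote, 2 per 2nd, 1 per 3rd.
from collections import Counter

POINTS = (3, 2, 1)
ballots = [  # each taster's (1st, 2nd, 3rd) picks -- hypothetical
    ("Wyeast 3000", "White Labs 2000", "White Labs 1000"),
    ("White Labs 2000", "Wyeast 3000", "Wyeast 1000"),
    ("Wyeast 3000", "White Labs 3000", "White Labs 2000"),
]

tally = Counter()
for ballot in ballots:
    for sample, pts in zip(ballot, POINTS):
        tally[sample] += pts

for sample, score in tally.most_common():
    print(f"{sample}: {score} points")
```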
[Chart: favorite-sample vote totals and point scores]
Although the Wyeast 3000mg/L sample came in first in terms of total number of votes, based on the point scores, the top preference was actually the White Labs 2000mg/L sample.  Even though these results don't match perfectly with the aggregated ratings above, the White Labs result is fairly close, since the 2000mg/L sample tied for 2nd-highest aggregate score.  To me, though, it's maybe more interesting that the 3000mg/L Wyeast sample received the most votes when five other samples had higher aggregate scores above.  My assumption is that this has to do with my participants' general preference for increased acidity.  Whereas the ratings above were based on ester/flavor concentrations, the votes and scores were based on which sample they liked best.  Since the 3000mg/L sample had a fair amount of aggregated flavor and a higher level of acidity, it makes logical sense that they might rate it higher than, say, a sample with a slightly higher level of combined flavors but a much lower acidity.  Unfortunately, this theory doesn't hold up so well when you look at the 3rd and 4th place samples…

Overall, the experiment was a lot of fun to conduct even though the results didn't mirror those that Yakobson found.  I was expecting to see stronger correlations between the esters and the initial levels of lactic acid, and although we didn't, I think the results I found show that perception doesn't always match up with reality.

Even though I attempted to recreate Yakobson's project as best I could, I'm fully aware that my experiment was not without flaws and that the two don't match up perfectly.  If I were to replicate it again, I definitely would take fermentative readings at the 35-day mark to see if the attenuation rates ended up closer to what Yakobson found.  As for the tasting, profiling six different esters across twelve different beers is a big challenge, and I think I would restructure the tasting a bit to help with this task.  First, I might break the tasting down into two sessions of six samples.  While it was fun to try the White Labs and Wyeast samples side by side to see how they differ, I believe that if we had judged only the six samples from one strain at a time, we wouldn't have run into palate fatigue as quickly, and our scores might have been spread out a bit more than they were.  Related to this, I think that using a 5-point rating scale clustered the results too much, and for future ratings, a 10-point scale would allow for a little more variation.  Lastly, I failed to provide everyone with a benchmark sample first.  Without a beer to calibrate our palates, the first sample that each of us tried probably averaged three points across the board, since we didn't have anything to compare the flavor concentrations against (it's impossible to know, though, since I asked everyone to start at different samples and move about the group randomly).