Laphroaig Tasting At Sigel’s Elite

I had some on-site meetings today that got me most of the way to the Sigel’s Elite, which made getting across town for the Laphroaig tasting much easier. It was no trouble to find, and there was plenty of parking.

I got in a few minutes after it started, and started off with the Triple Wood. I’ve had this one before, and it’s one of my favorite expressions. It goes really well with sour or tart foods, such as berry cobblers. The peat is definitely there, as it should be in an Islay malt, but with the time in sherry casks, it’s got a nice balance to it.

After that one, I played catch-up and sampled the Select expression that they’d started with. This is their attempt at recreating a pre-Prohibition whisky. It’s pretty darned good, with a much more forward oak flavor than I was expecting. I think this would pair really well with a nice bison ribeye.

Next up was the 18 Year Old, another one of my favorites. Being an older whisky, it’s definitely more mellow than its younger siblings, especially the peat. But that allows the other flavor notes to really shine. I’m not sure what I’d pair this with, because it’s one to be savored. I think that good friends and possibly a good cigar are the perfect accompaniment for a wee dram of this whisky.

Once we had finished that one, we got to taste the 2014 Cairdeas. I guess Simon was feeling the Christmas spirit, because this was completely unexpected. I’ve got a bottle of it at the house, but I haven’t opened it yet. It was definitely interesting, with a pronounced sherry note, along with some yeasty flavors. I’m not a huge fan of sherry finishes, but this one is an exception. I’m thinking it’d go well with a crème brûlée or maybe some sweet crepes.

And finally, we got a real Christmas treat as Simon opened up a bottle of the 25 Year Old. This is Laphroaig’s oldest normally available bottling, and one I’ve been wanting to try for a long time. It’s just been a little too expensive to buy a bottle on a whim, so this was definitely a treat. What can I say? It was fantastic. The peat’s hiding in the background, but there’s a flavor explosion going on with everything else that’s been hiding or developing over time.

Once the tasting was done, I spent a few minutes chatting with Simon before picking up a few bottles of the Select, and getting one signed. All in all, a great way to spend an evening.

Beer Tasting At Grapevine Craft Brewery

Heather and I went down to Grapevine Craft Brewery to check them out and try some tasty brews. They’re still in their Farmers Branch location, but are expecting to be moved into the Grapevine facility by April 2015. They do tastings on the first Saturday of the month, and this one happened to coincide with North Texas Beer Week, which meant that we got special pint glasses with our tickets. The tour is pretty standard for a craft brewery, with a discussion of how beer is made and what the equipment does. They jazzed it up a little with some trivia questions throughout the presentation. I happened to know the term for chilling beer to get the yeast to settle out: cold crashing. I shouted it out and won a gift certificate to Finley’s Barber Shop, who were there doing shaves for Movember.

The first beer I tried was Sir William’s English Brown Ale, which was the subject of some controversy earlier this year and has subsequently vindicated itself by winning a gold medal at the Great American Beer Festival. It’s a really nice brown ale, light and fruity with a good malt finish and a gentle hopping. Plus I got to hold the actual gold medal, which was pretty cool.

My second and final beer was the Bourbon Wood Aged Nightwatch Oatmeal Stout. I’m pretty picky when it comes to oaked or barrel aged beers, especially bourbon barrel aged beers because most of them just taste too unbalanced to me. This was one of the exceptions, and a delicious exception it was. The oak and bourbon balanced nicely with the oatmeal stout flavors to create a delicious beer.

We spent some time enjoying the beers, talking to the staff and the guys from Finley’s, and nosing around the brewery. I picked up a shirt, and we got a couple koozies as freebies, plus our pint glasses. After that, we headed out to finish our Saturday errands.

Using A Proxy With Puppet’s pe_gem Module

So, I’ve defined a puppet_master role and profile to help manage some gems and other stuff that is needed there. Because the gems need to be installed in Puppet’s Ruby environment as opposed to the system’s Ruby environment, I installed the pe_gem module. It’s a simple module that adds a new provider, pe_gem, for the package type, which replaces the gem command with the one in Puppet’s Ruby.
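For reference, using the provider looks something like this (the gem name here is just an illustrative example, not one from my actual manifests):

```puppet
# Install a gem into Puppet Enterprise's bundled Ruby instead of the
# system Ruby, by selecting the pe_gem provider. The gem name is made up.
package { 'deep_merge':
  ensure   => present,
  provider => pe_gem,
}
```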

In one of my environments, all of the nodes must access the Internet via a proxy. And that started this wacky adventure, complete with trips down multiple rabbit holes.

My first attempt was to set the proxy environment variables. That works great from the command line, but it fails when the agent runs. So, moving on: the documentation indicates that I could add the proxy command line argument via the package type’s install_options attribute.

Yeah, not so much. The package never actually got that option, due to a bug in the pe_gem module: it turns out the provider is missing the feature declaration for install_options. I fixed that locally, tested it, and then forked the module, checked in my fix, and created a pull request on GitHub. That sorta worked. It’ll install the gems, but the list command that Puppet runs to check the version still fails, because install_options isn’t used for ‘gem list’ commands. At that point I set the ensure attributes to ‘present’ and gave up for the night.
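For the curious, the resource I was aiming for looked roughly like this (the gem name and proxy URL are placeholders):

```puppet
# Pass the proxy through to `gem install` via install_options.
# Note that Puppet does not apply install_options to the `gem list`
# command it runs to check the installed version.
package { 'deep_merge':
  ensure          => present,
  provider        => pe_gem,
  install_options => ['--http-proxy', 'http://proxy.example.com:3128'],
}
```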

This morning I tried using a gemrc file to specify the proxy. Putting http_proxy in ~/.gemrc doesn’t work, because Puppet unsets $HOME. Putting it in /etc/gemrc kind of works, but gem tries the list command without a proxy first. Not ideal, but I could live with that if I had to. Not satisfied with that, though, I started crawling through Puppet’s Ruby code and found the answer. Bingo! It’s not /etc, it’s /opt/puppet/etc. So I moved the gemrc over to /opt/puppet/etc, which appears to have solved everything.
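For anyone hitting the same wall, the gemrc itself is just a tiny YAML file; something like this (the proxy URL is a placeholder) in /opt/puppet/etc/gemrc did the trick:

```yaml
# /opt/puppet/etc/gemrc -- picked up by PE's bundled RubyGems
http_proxy: http://proxy.example.com:3128
```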

PuppetConf 2014 – Day 5

I was pretty wiped out last night and slept in a little this morning, but I still got over to the conference in time to get some breakfast before heading in to this morning’s keynotes. The first speaker was Dan Spurling of Getty Images. He discussed how and why they’ve integrated Puppet into their environment, as well as some of his philosophy on development, operations, and getting everyone to play nice together. The second keynote was delivered by Alan Green of Sony Computer Entertainment America. He talked about how they use Puppet too, but also about how they support their many internal groups and those groups’ extremely varied IT needs. After that, Luke came back on to do a Q&A session, which gave us some more insight into what’s going on in the Puppet universe.

Once the keynotes were done, we headed out to our technical sessions. I started with an introduction to MCollective, an asynchronous, queue-driven job management service that comes with Puppet. I’ve got some really good ideas on how this will be put to use on my customers’ systems. After that session was done, it was time to get some lunch. I had a slightly longer break, so I headed back to the hotel room to drop off some more exhibitor loot before returning to PuppetConf 2014 to grab some lunch and get ready for the afternoon sessions.

The first session of the afternoon discussed OpenStack and how it can be managed with Puppet. It’s a pretty complex system, but there are modules out on the Forge that make it pretty simple and painless to set up, configure, and run using Puppet. Next up was what was supposed to be a tour of Puppet subsystems, but it really turned into an overview of part of the execution path of the Puppet agent and master code. It was interesting, but wasn’t really what I was hoping for. After that I headed over to catch a session about managing a multi-tier architecture using Puppet. It seemed like a good idea because we have a lot of that at our customers’ sites. And then there was the session put on by F5 Networks, covering their new REST API for managing their network gear. That is going to come in really handy, and considering you’ll be able to do just about everything you can do on the command line using REST calls, it’s going to rock! Our last session of the day covered Elasticsearch’s ELK stack, and was delivered by Jordan Sissel. This was a product stack that I didn’t know much about before now, but after this presentation, I’m going to be spinning up a VM to try it out. It looks like it might be a good replacement for Splunk, with a bunch of extra functionality to boot.

It was a good way to close out the conference, and I made my way back to the hotel pretty brain-fried from all of the information that had been crammed into my head over the last five days. The conference was great, and I hope I get the opportunity to go again next year.

PuppetConf 2014 – Day 4

Even though yesterday was technically the first day of PuppetConf 2014, everything really got going today. We started off by going to the keynotes, which were definitely interesting. The first one was given by Luke Kanies, the founder of Puppet Labs. He talked about where they’ve been, what’s going on now, and a little about what’s coming up. After he was done, Gene Kim, author of The Phoenix Project, took the stage to talk about DevOps. I’ve had a low opinion of the DevOps methodology for a while, but after listening to him talk, I think I’m going to have to get his book and reevaluate my opinions. The third keynote was delivered by Kate Matsudaira, CEO of popforms. She went over some career management and improvement strategies.

After the keynotes, we walked around the main hall, checking out the vendors, before heading to our selected sessions. I started by checking out the demos that Puppet Labs were running, got to try out the new Node Classifier (it’s pretty amazing), and joined the Test Pilot program. After the lunch break, I started in on the sessions with one about scaling Puppet for large environments. My next session covered auditing and security related operations using Puppet, including being able to enforce basic security policies through classes.

After that session, I headed upstairs to do some last-minute reviewing before taking my Puppet Professional certification exam. I can’t talk about specifics of the exam because I signed an NDA, but let me say that it was pretty challenging and really makes you think about the questions. I did pass it, and am now certified.

Taking the exam used up pretty much all of the rest of the afternoon, so by the time I was done and had met back up with my colleagues, it was time to head over to an off-site mixer sponsored by Puppet Labs. We all had a good time there, and got to spend some time talking to other conference attendees as well as Puppet Labs employees.

That pretty much wrapped up the day, and I headed back to the hotel to get some sleep to prepare for tomorrow.

PuppetConf 2014 – Day 3

Today was the final day of our Puppet Practitioner class. We went over classifying nodes using roles and profiles. This is something I’ve already been implementing, and it goes a long way towards taking the pain out of classifying nodes. We also covered the MCollective framework and how it can be used. With it, you’re able to execute specific classes on specific nodes, which is something I’m going to be able to put to good use for some of my clients’ needs.
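As a quick sketch of the roles-and-profiles pattern (the module and class names here are made up, not from my actual setup): each node gets exactly one role, a role is just a class that composes profiles, and profiles wrap and configure the component modules.

```puppet
# Hypothetical role/profile layering.
class role::webserver {
  include profile::base
  include profile::apache
}

# A profile wraps the actual component module and sets site policy.
class profile::apache {
  class { 'apache':
    default_vhost => false,
  }
}
```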

We also covered manifest testing using the rspec-puppet and serverspec tools. I do a lot of software development in Perl, which has a good testing framework that I’ve used extensively. It was a real pleasure to learn about and get to work with Ruby and Puppet’s equivalent tools. It was also nice to see that coverage analysis data is available to help me build better test cases. In my opinion, not doing this kind of testing, and not looking at coverage analysis, is one of the leading causes of poor-quality code. No, the tools are not perfect, and yes, you do have to design the tests for the code, but they allow you to test your code before it ever touches a node, and that’s a Very Good Thing™.
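To give a flavor of it, a minimal rspec-puppet spec looks something like this (the ntp class and service name are just examples, and this assumes the usual rspec-puppet spec_helper setup inside a module):

```ruby
# spec/classes/ntp_spec.rb -- a sketch, not from an actual module
require 'spec_helper'

describe 'ntp' do
  # The catalog should compile cleanly with all dependencies.
  it { is_expected.to compile.with_all_deps }

  # And it should manage the service we expect.
  it { is_expected.to contain_service('ntpd').with_ensure('running') }
end
```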

We wrapped up our last few class activities and then had some question and answer time with our instructor. After that, I packed up my stuff and headed down to the main hall to get registered for the actual conference. The training I’ve been taking was scheduled before PuppetConf 2014 started instead of during the conference. That means I won’t be missing any of the conference activities due to being in a classroom. So now I’ve got my swag bag, my badge, and my conference t-shirt, and will be working through a pretty solid schedule of presentations and sessions during the next few days.

PuppetConf 2014 – Day 2

Today at PuppetConf 2014 was another class day. We got into some really interesting stuff, and by interesting I mean your eyes are about to glaze over if you’re not channeling your inner (or outer) nerd.

Custom facts. I’ve actually worked with these before, but filling in the blanks was nice. Just remember that Ruby returns the value of the last statement evaluated if you don’t have an explicit return in there. That’s also known as the reason I wasted hours, and learned how to debug Puppet modules, when I implemented my first custom fact.
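The gotcha is plain Ruby behavior, so here’s an illustration of it outside of Facter (the method names and values are made up): a fact’s value is whatever its resolution block returns, so a stray trailing statement silently becomes the value.

```ruby
# A custom fact's value is the return value of its resolution block,
# and Ruby implicitly returns the last expression it evaluated.

def fact_with_stray_statement
  value = 'expected'
  log = "resolved fact: #{value}"  # last expression evaluated...
  # ...so the method returns the log string, not `value`
end

def fact_with_explicit_return
  value = 'expected'
  log = "resolved fact: #{value}"
  return value  # make the intended value explicit
end
```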

Hiera. Learn it and use it, especially since classes can pull their parameter defaults from it. My next thing is to explore using MySQL as a data source instead of YAML. Either way, get that data out of your modules and into Hiera. I’ve got a bit of that to do in my lab environment at home.
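As a sketch of what that looks like (the paths and keys are made up, and this is the Hiera 1.x layout current as of 2014): hiera.yaml defines the lookup hierarchy, and a key named classname::parameter is automatically used as that class parameter’s default.

```yaml
# hiera.yaml -- hierarchy runs from most to least specific
:backends:
  - yaml
:hierarchy:
  - "node/%{::fqdn}"
  - common
:yaml:
  :datadir: /etc/puppetlabs/puppet/hieradata

# common.yaml -- automatic parameter lookup would fill in the
# `servers` parameter of an `ntp` class from a key like this:
#
# ntp::servers:
#   - 0.pool.ntp.org
#   - 1.pool.ntp.org
```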

File manipulation. Man, this is where the fun starts. Not that custom facts and Hiera aren’t fun, but really, if you want to do something that makes actual, real, useful changes on your systems, this is it. Between file_line, concatenating fragments to build files, and managing config files with Augeas, you’re pretty well covered for all of your file manipulation needs. I’m going to be experimenting with using Augeas to build DNS zone files for my home lab environment really soon now. I’m starting to think that it could be used for internal DNS in a datacenter environment. It’d be a lot easier than the manual processes that I see in use currently.
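As an illustration of two of those approaches (the settings here are made-up examples; file_line comes from the puppetlabs-stdlib module):

```puppet
# Ensure a single line exists in a file you don't otherwise manage.
file_line { 'enable ip forwarding':
  path => '/etc/sysctl.conf',
  line => 'net.ipv4.ip_forward = 1',
}

# Or use Augeas to edit one setting in a structured config file
# without templating the whole thing.
augeas { 'sshd disable root login':
  context => '/files/etc/ssh/sshd_config',
  changes => 'set PermitRootLogin no',
}
```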

Tomorrow is our last day of class, and then the conference starts. I’ve got a certification exam scheduled for later in the week, and while I was feeling pretty good about my chances, I think this class has definitely helped out. We’ll see once I actually get to take it.

PuppetConf 2014 – Day 1

I’m spending the week out in San Francisco for PuppetConf 2014. Puppet is a configuration management system used to keep the configurations of large collections of servers in sync, as well as to manage application deployments, users, and pretty much any other resource you can think of.

I’m here early because I’m taking one of their instructor-led training classes in order to fill in some of the blanks in my self-taught skill set. We had a rough start today because the wireless network that the hotel set up for us was not configured correctly for running training labs. It’s hard to spin up VMs on your laptops and get them on the network when you’re limited to one IP per vPort. We did finally get it working and are mostly caught up to where we should have been for today’s lesson plans.

After getting signed in and collecting my t-shirt, I headed up to my assigned classroom. We first worked through a review of some Puppet basics and then moved on to data structures like arrays and hashes, and to using virtual resources to simplify complex declarations. The class, and Puppet in general, are very Git-centric, which is nice to see. Since you’re, in effect, turning your site’s configuration into code, you need to put that code under some form of revision control, and Git is probably the best system out there. I’m not going to go down that rabbit hole, but seriously, it works really well.

The hotel’s facilities are really nice, with lots of meeting spaces for the classes and even some pretty good catering for breakfast and lunch. We finally broke for the day a little after 4:00 PM, and will be back tomorrow to do this all over again. I’m already filling in some of the blanks, and have a few modules at home that need to be rewritten…

New Orleans – Day 2

Today’s plan was to take the walking tour of the courtyards in New Orleans’s French Quarter that we had booked with Le Monde Créole, so we got up early and headed out to Café Du Monde for the obligatory beignets and coffee. It was really crowded, but we managed to shark a table and get our order in. The beignets were really good, as was the coffee. And to make things better, we had some great people-watching opportunities thanks to where our table was. Unfortunately, it also gave us a good view of the rain that decided to start coming down.

After breakfast, we made our way to the starting point to meet up with our guide. It had been raining off and on by this point, but was getting to be more on than off, so on the way I picked up an umbrella. The tour was interesting, though we did have to deal with some weather related disruptions. But none of that interfered with the stop at St. Louis Cemetery #1. That was the high point of the tour for me, and I used the time to take a bunch of pictures of the tombs. From there, we walked back in to the French Quarter and finished the tour off.

By this point we were both soaked, so we made our way back to the B&B, with a stop for lunch at Port of Call. It had been highly recommended by our hosts, and it was pretty clear why. The burgers were excellent, and instead of french fries, they came with baked potatoes. That was different, but pretty tasty and filling after all of the walking we’d been doing. The bar was packed, standing room only, and more soaked people kept arriving. It definitely lived up to its name today! With full bellies, we headed back to get out of the wet clothes and relax for a little while.

Dried off and feeling a lot better since the rain had finally stopped, we went back out to do a little shopping and get some dinner. We ate at the Crescent City Brewhouse, which was pretty darned good. I ordered a flight of their current offerings and really wasn’t disappointed in any of them. The waitress tried to talk us in to dessert, but we had a recommendation from one of the locals that we wanted to try out, so we walked down to French 75 to finish stuffing ourselves.

The desserts definitely lived up to the recommendations we’d received, with Heather having a crème brûlée and me having a bread pudding. I got talked into ordering a cocktail, not usually something I like all that much, and asked the bartender to surprise me. He asked me a few questions about what I liked and then went to work. What I got was something like a Bobby Burns, but with Ardbeg and some amaro added in. It was delicious, and has made me rethink cocktails.

This was the end of our night, and we were beat, so we took our wiped out selves back to the B&B and crashed out.

New Orleans – Day 1

We left Tennessee early this morning and drove down to New Orleans, where we’ll be spending the weekend. We’re staying at the Banana Courtyard B&B, a nice, old house right on the edge of the French Quarter. Our room is located on the second floor, and is decently sized, with its own bathroom, and a full sized bed.

After getting checked in, our hosts took us on a short walking tour of the area, and then left us to our own devices. We wandered around for a little while and then stopped for a delicious dinner at The Old Coffee Pot. The po boys were tasty and the service was excellent, though they didn’t have the red velvet cake we were hoping to have for dessert. The chocolate cake was excellent, so it wasn’t all bad.

From there we did a little nosing around, and some shopping. I found some great hot sauces and a seriously strong horseradish. That stuff will clean your sinuses out from about a city block away. Before heading back to the room, we caught the first of the Southern Decadence parades. It was small, but a lot of fun. We’ve got a walking tour booked for tomorrow, so we wanted to have an early night tonight.