PuppetConf 2014 – Day 5

I was pretty wiped out last night and slept in a little this morning, but I still got over to the conference in time to get some breakfast before heading in to this morning’s keynotes. The first speaker was Dan Spurling of Getty Images. He discussed how and why they’ve integrated Puppet into their environment, as well as some of his philosophy on development, operations, and getting everyone to play nice together. The second keynote was delivered by Alan Green of Sony Computer Entertainment America. He also talked about how they use Puppet, and about how they support their many internal groups with extremely varied IT needs. After that, Luke came back on to do a Q&A session, which gave us some more insight into what’s going on in the Puppet universe.

Once the keynotes were done, we headed out to our technical sessions. I started with an introduction to MCollective, which is an asynchronous, queue-driven job management service that comes with Puppet. I’ve got some really good ideas on how this will be put to use on my customers’ systems. After that session was done it was time to go get some lunch. I had a slightly longer break, so I headed back to the hotel room to drop off some more exhibitor loot before returning to PuppetConf 2014 to grab some lunch and get ready for the afternoon sessions.

The first session of the afternoon covered OpenStack and how it can be managed with Puppet. It’s a pretty complex system, but there are modules out on the Forge that make it pretty simple and painless to set up, configure, and run using Puppet. Next up was what was supposed to be a tour of Puppet subsystems, but it really turned into an overview of part of the execution path of the Puppet agent and master code. It was interesting, but not really what I was hoping for. After that I headed over to catch a session about managing a multi-tier architecture using Puppet, which seemed like a good fit since we have a lot of that at our customers’ sites. Then there was the session put on by F5 Networks covering their new REST API for managing their network gear. That is going to come in really useful, and considering you’ll be able to do just about everything you can do on the command line using REST calls, it’s going to rock! Our last session of the day covered the Elasticsearch ELK stack and was delivered by Jordan Sissel. This is a product stack I didn’t know much about before now, but after this presentation I’m going to be spinning up a VM to try it out. It looks like it might be a good replacement for Splunk, with a bunch of extra functionality to boot.

It was a good way to close out the conference, and I made my way back to the hotel pretty brain-fried from all of the information that has been crammed into my head over the last five days. The conference was great, and I hope I get the opportunity to go again next year.

PuppetConf 2014 – Day 4

Even though yesterday was technically the first day of PuppetConf 2014, everything really got going today. We started off by going to the keynotes, which were definitely interesting. The first one was given by Luke Kanies, the founder of Puppet Labs. He talked about where they’ve been, what’s going on now, and a little about what’s coming up. After he was done, Gene Kim, author of The Phoenix Project, took the stage to talk about DevOps. I’ve had a low opinion of the DevOps methodology for a while, but after listening to him talk, I think I’m going to have to get his book and reevaluate my opinions. The third keynote was delivered by Kate Matsudaira, CEO of popforms, who went over some career management and improvement strategies.

After the keynotes, we walked around the main hall checking out the vendors before heading to our selected sessions. I started by checking out the demos that Puppet Labs was running, got to try out the new Node Classifier (it’s pretty amazing), and joined the Test Pilot program. After the lunch break, I started in on the sessions with one about scaling Puppet for large environments. My next session covered auditing and security-related operations using Puppet, including enforcing basic security policies through classes.
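To give an idea of what that looks like, here’s a rough sketch of the kind of policy-enforcing class I have in mind; the class name and the specific settings are my own examples, not anything taken from the session:

    class security_baseline {
      # Remove legacy cleartext services entirely
      package { ['telnet-server', 'rsh-server']:
        ensure => absent,
      }

      # Keep the host firewall running and enabled at boot
      service { 'iptables':
        ensure => running,
        enable => true,
      }

      # Lock down the permissions on the shadow file
      file { '/etc/shadow':
        owner => 'root',
        group => 'root',
        mode  => '0000',
      }
    }

Once something like that is in a class, the policy gets enforced on every run instead of being a checklist item someone has to remember.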

After that session, I headed upstairs to do some last-minute reviewing before taking my Puppet Professional certification exam. I can’t talk about the specifics of the exam because of the NDA, but I will say that it was pretty challenging and really made me think about the questions. I did pass it, and am now certified.

Taking the exam used up pretty much all of the rest of the afternoon, so by the time I was done and had met back up with my colleagues, it was time to head over to an off-site mixer sponsored by Puppet Labs. We all had a good time there, and got to spend some time talking to other conference attendees as well as Puppet Labs employees.

That pretty much wrapped up the day, and I headed back to the hotel to get some sleep to prepare for tomorrow.

PuppetConf 2014 – Day 3

Today was the final day of our Puppet Practitioner class. We went over classifying nodes using roles and profiles. This is something I’ve already been implementing, and it goes a long way toward taking the pain out of classifying nodes. We also covered the MCollective framework and how it can be used. With it you’re able to execute specific classes on specific nodes, which is something I’m going to be able to put to good use for some of my clients’ needs.
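For anyone who hasn’t run into the roles and profiles pattern, this is roughly the shape of it; all of the module and class names below are made up for illustration:

    # Profiles wrap individual pieces of technology...
    class profile::ntp {
      package { 'ntp':  ensure => installed }
      service { 'ntpd': ensure => running, enable => true }
    }

    class profile::webserver {
      include ::apache  # assuming the puppetlabs-apache module from the Forge
    }

    # ...a role is nothing more than a list of profiles...
    class role::webserver {
      include profile::ntp
      include profile::webserver
    }

    # ...and each node gets exactly one role.
    node 'web01.example.com' {
      include role::webserver
    }

The nice part is that classification stays simple: a node (or the node classifier) only ever needs to know about one role.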

We also covered manifest testing using the rspec-puppet and serverspec tools. I do a lot of software development in Perl, which has a good testing framework that I’ve used extensively, so it was a real pleasure to learn about and get to work with the Ruby and Puppet equivalents. It was also nice to see that there is coverage analysis available to help me build better test cases. In my opinion, not doing this kind of testing, and not looking at coverage analysis, is one of the leading causes of poor-quality code. No, the tools are not perfect, and yes, you do have to design the tests for the code, but they let you test your code before it ever touches a node, and that’s a Very Good Thing™.

We wrapped up our last few class activities and then had some question and answer time with our instructor. After that, I packed up my stuff and headed down to the main hall to get registered for the actual conference. The training I’ve been taking was scheduled before PuppetConf 2014 started instead of during the conference. That means I won’t be missing any of the conference activities due to being in a classroom. So now I’ve got my swag bag, my badge, and my conference t-shirt, and will be working through a pretty solid schedule of presentations and sessions during the next few days.

PuppetConf 2014 – Day 2

Today at PuppetConf 2014 was another class day. We got into some really interesting stuff, and by interesting I mean the kind of stuff that makes your eyes glaze over if you’re not channeling your inner (or outer) nerd.

Custom facts. I’ve actually worked with these before, but filling in the blanks was nice. Just remember that Ruby returns the value of the last statement evaluated if you don’t have an explicit return in there. That’s also known as the reason I wasted hours and learned how to debug Puppet modules when I implemented my first custom fact.

Hiera. Learn it and use it, especially since classes can pull their defaults from it. My next step is to explore using MySQL as a data source instead of YAML. Either way, get that data out of your modules and into Hiera. I’ve got a bit of that to do in my lab environment at home.
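As a quick sketch of what pulling class defaults from Hiera looks like with Puppet 3’s automatic data bindings (the class and key names are just examples of mine, not from the class materials):

    # The key 'ntp::servers' is looked up in Hiera automatically; the
    # default below is only used if nothing is found in the hierarchy.
    class ntp ($servers = ['0.pool.ntp.org', '1.pool.ntp.org']) {
      file { '/etc/ntp.conf':
        ensure  => file,
        content => template('ntp/ntp.conf.erb'),  # the template iterates over $servers
      }
    }

    # The matching entry in a YAML data file would look something like:
    #   ntp::servers:
    #     - 'time1.example.com'
    #     - 'time2.example.com'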

File manipulation. Man, this is where the fun starts. Not that custom facts and Hiera aren’t fun, but if you want to make actual, real, useful changes on your systems, this is it. Between file_line, concatenating fragments to build files, and managing config files with Augeas, you’re pretty well covered for all of your file manipulation needs. I’m going to be experimenting with using Augeas to build DNS zone files for my home lab environment really soon now, and I’m starting to think it could be used for internal DNS in a datacenter environment. It would be a lot easier than the manual processes I currently see in use.
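Here’s a rough sketch of the three approaches side by side; the files and settings are just examples I’m likely to play with, not anything from the class materials:

    # file_line, from puppetlabs-stdlib: make sure a single line exists
    file_line { 'no_root_ssh':
      path => '/etc/ssh/sshd_config',
      line => 'PermitRootLogin no',
    }

    # concat, from puppetlabs-concat: build a file out of ordered fragments
    concat { '/etc/motd': }
    concat::fragment { 'motd_header':
      target  => '/etc/motd',
      content => "Managed by Puppet\n",
      order   => '01',
    }

    # augeas: change one value inside a structured config file in place
    augeas { 'sshd_no_x11':
      context => '/files/etc/ssh/sshd_config',
      changes => 'set X11Forwarding no',
    }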

Tomorrow is our last day of class, and then the conference starts. I’ve got a certification exam scheduled for later in the week, and while I was feeling pretty good about my chances, I think this class has definitely helped out. We’ll see once I actually get to take it.

PuppetConf 2014 – Day 1

I’m spending the week out in San Francisco for PuppetConf 2014. Puppet is a configuration management system used to keep the configurations of large collections of servers in sync, as well as to manage application deployments, users, and pretty much any other resource you can think of.
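For anyone who hasn’t seen it before, a manifest describing a few of those resources looks something like this (purely an illustrative example):

    # Make sure the SSH server is installed, running, and enabled at boot
    package { 'openssh-server':
      ensure => installed,
    }

    service { 'sshd':
      ensure  => running,
      enable  => true,
      require => Package['openssh-server'],
    }

    # Manage a user account in the same declarative way
    user { 'deploy':
      ensure     => present,
      shell      => '/bin/bash',
      managehome => true,
    }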

I’m here early because I’m taking one of their instructor-led training classes in order to fill in some of the blanks in my self-taught skill set. We had a rough start today because the wireless network that the hotel set up for us was not configured correctly for running training labs. It’s hard to spin up VMs on your laptops and get them on the network when you’re limited to one IP per vPort. We did finally get it working and are mostly caught up to where we should have been for today’s lesson plans.

After getting signed in and collecting my t-shirt, I headed up to my assigned classroom. We first worked through a review of some Puppet basics and then moved on to data structures like arrays and hashes, and to using virtual resources to simplify complex declarations. The class, and Puppet in general, are very Git-centric, which is nice to see. Since you’re, in effect, turning your site’s configuration into code, you need to put that code under some form of revision control, and Git is probably the best system out there. I’m not going to go down that rabbit hole, but seriously, it works really well.
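As a quick sketch of the virtual resource idea from today’s exercises (the user names and groups are made up):

    # Declared virtually, these users are visible everywhere but applied nowhere
    @user { 'alice': ensure => present, groups => ['wheel'] }
    @user { 'bob':   ensure => present, groups => ['devs'] }

    # A node only gets the ones it explicitly realizes...
    realize(User['alice'])

    # ...or the ones matched by a collector on an attribute.
    User <| groups == 'devs' |>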

The hotel’s facilities are really nice, with lots of meeting spaces for the classes and even some pretty good catering for breakfast and lunch. We finally broke for the day a little after 4:00 PM, and will be back tomorrow to do this all over again. I’m already filling in some of the blanks, and have a few modules at home that need to be rewritten…

New Orleans – Day 2

Today’s plan was to take the walking tour of the courtyards in New Orleans’s French Quarter that we had booked with Le Monde Créole, so we got up early and headed out to Café Du Monde for the obligatory beignets and coffee. It was really crowded, but we managed to shark a table and get our order in. The beignets were really good, as was the coffee. And to make things better, we had some great people-watching opportunities thanks to where our table was. Unfortunately, it also gave us a good view of the rain that decided to start coming down.

After breakfast, we made our way to the starting point to meet up with our guide. It had been raining off and on by this point, but was getting to be more on than off, so on the way I picked up an umbrella. The tour was interesting, though we did have to deal with some weather-related disruptions. None of that interfered with the stop at St. Louis Cemetery #1, though. That was the high point of the tour for me, and I used the time to take a bunch of pictures of the tombs. From there, we walked back into the French Quarter and finished the tour off.

By this point we were both soaked, so we made our way back to the B&B, with a stop for lunch at Port of Call. It had been highly recommended by our hosts, and it was pretty clear why. The burgers were excellent, and instead of french fries, they came with baked potatoes. That was different, but pretty tasty and filling after all of the walking we’d been doing. The bar was packed, standing room only, and more soaked people kept arriving. It definitely lived up to its name today! With full bellies, we headed back to get out of our wet clothes and relax for a little while.

Dried off and feeling a lot better since the rain had finally stopped, we went back out to do a little shopping and get some dinner. We ate at the Crescent City Brewhouse, which was pretty darned good. I ordered a flight of their current offerings and really wasn’t disappointed in any of them. The waitress tried to talk us into dessert, but we had a recommendation from one of the locals that we wanted to try out, so we walked down to French 75 to finish stuffing ourselves.

The desserts definitely lived up to the recommendations we’d received, with Heather having a crème brûlée and me having a bread pudding. I got talked into ordering a cocktail, not usually something I like all that much, and asked the bartender to surprise me. He asked me a few questions about what I liked and then went to work. What I got was something like a Bobby Burns, but made with Ardbeg and with some amaro added in. It was delicious, and has made me rethink cocktails.

This was the end of our night, and we were beat, so we took our wiped out selves back to the B&B and crashed out.

New Orleans – Day 1

We left Tennessee early this morning and drove down to New Orleans, where we’ll be spending the weekend. We’re staying at the Banana Courtyard B&B, a nice old house right on the edge of the French Quarter. Our room is on the second floor and is decently sized, with its own bathroom and a full-sized bed.

After getting checked in, our hosts took us on a short walking tour of the area, and then left us to our own devices. We wandered around for a little while and then stopped for a delicious dinner at The Old Coffee Pot. The po boys were tasty and the service was excellent, though they didn’t have the red velvet cake we were hoping to have for dessert. The chocolate cake was excellent, so it wasn’t all bad.

From there we did a little nosing around, and some shopping. I found some great hot sauces and a seriously strong horseradish. That stuff will clean your sinuses out from about a city block away. Before heading back to the room, we caught the first of the Southern Decadence parades. It was small, but a lot of fun. We’ve got a walking tour booked for tomorrow, so we wanted to have an early night tonight.

Drill Press Issues

As we’re putting together our wood shop, we decided to pick up a drill press, and settled on the Shop Fox W1668 because it was small enough to manage but still had enough power to do what we need. Amazon had the best price and it was Prime eligible, so the decision on where to order was easy. When it arrived, it looked decent at first glance, but there had been some serious issues during shipping. The headstock had suffered some hard impacts and was badly damaged: the drive system housing was dented and bent, and not just the door, and the motor housing had been bent to the point where the motor wouldn’t spin freely. There was a lot of scraping and binding when we tried to turn it by hand. All of this damage came down to poor packaging that didn’t protect the parts from rough treatment during shipping. Styrofoam blocks aren’t enough padding for something that weighs over one hundred pounds, and the box should have been strapped to a pallet instead of just being double boxed. That would have prevented quite a bit of the impact damage. Amazon did step up and send us another one when we called them, and it arrived intact.

In addition to the shipping damage, there were some quality control issues that would have required us to open an RMA even if the first one had arrived intact. The column from the first unit was missing one of the four holes used to bolt it down to the base. Thankfully the one in the second box was correct, so we used that one. The table in the second box, on the other hand, had a bubble in its casting, so we used the one from the first box.

We now have a fully assembled and functional drill press, and I’ve done some test drills to verify the functionality. It’s working exactly as expected and will be a nice addition to the shop. Just be careful when you order it…

Inside the box.

Damaged headstock.

The column with the missing hole.

The table from the second drill press, showing the casting issue.

Real Pirates Exhibition At Moody Gardens

We’re down in Galveston for the Independence Day holiday, spending time with the family and making sure Reese is getting some Nana time. We’re not going too crazy, but we did go over to Moody Gardens to check out the Real Pirates Exhibition at the Discovery Museum.

They’ve got a bunch of artifacts from the Whydah, a slave ship turned pirate ship from the early 1700s. The exhibition uses those artifacts to illustrate the slave trade, piracy, and life aboard a real pirate ship. It’s amazing to see how well the items have survived almost 300 years in the ocean. Even some of the textiles were in decent condition.

A Duel Betwixt Us – A Review

After a rough Kickstarter campaign, my copy of A Duel Betwixt Us arrived today. Game Salute stepped in and helped get production back on track, with regular updates to make sure we knew what was going on. The packaging is very well done, with a large box, bagged and shrink-wrapped cards, and pop-up boxes to store the cards in once they’re unbagged. Those boxes are sized so the cards will fit whether they’re sleeved or not.

In A Duel Betwixt Us, you assume the role of a Victorian gentleman (or lady) and attempt to defeat your rival on the field of honor. Reese and I did a read-through of the rules and then set up a game to playtest it. The rules are pretty simple, with a quick setup and an easy turn-based play mode. You draw cards that represent different weapons and armor that you have to build before you can use them, as well as cards that represent events (things that happen to you, your rival, or both of you) or tricks (used to alter the outcome of a duel). You have miners who toil for you, creating the ingots used to build your equipment, and who occasionally fight for you or get attacked by your rival.

And finally, there’s the dueling. There are a number of different duel cards, each representing a different type of duel with its own rules about which kinds of equipment and other cards can be used. Combat is simple: once a duel has been declared, both players declare which equipment they will be using, play any tricks, and then total up the numbers on their items. The winner either takes a favor from their rival or adds two new miners to make more ingots. When one player has taken all of their rival’s favors, the game ends and they win.

We had a good time playing it, with a full game taking about an hour to complete including reading the rules and doing the setup. This should be a fast, fun game for parties or other gatherings because even the audience will be amused at the snark in the cards and gameplay.

The game box.

The box contents.