Yearly State of the Game Room 2017

And here we are again: the yearly State of the Game Room. With the move to a new place, I got to put all the games into a single room, which is a positive: I can display them all rather than hunting through a couple of rooms to locate a game. The lighting in this room is also a lot better, which makes gaming a lot easier.

I've added a good six Kallax shelf squares of games over the year. Bunny Kingdom and The Thing are the two we enjoyed the most and the ones we'll be taking to the end-of-year bash. Another one we played was The Others: a pretty interesting game with lots of bits, but it ran long a couple of times. I can do two or three hours of a game before I start wanting to wrap it up.

Here's the list of games I can remember for the year, reconstructed by looking at the shelves. I'm sure I missed a couple, especially since we moved this year. I do have an inventory system so I can keep track, but I haven't been keeping up on it. I'm working on getting it updated over the next few weeks. Maybe I'll have a better list for next year 🙂

* 7th Sea RPG
* 7 Wonders: Duel Pantheon
* Apocalypse Prevention, Inc RPG
* Arkham Horror the Card Game
* Blood Bowl
* Bottom of the 9th
* Bunny Kingdom
* a few Call of Cthulhu books
* Castles of Burgundy Card Game
* Charterstone
* Clank
* Conan RPG
* Conan Board Game
* Dark Souls
* DC Deck Building Multiverse Box
* Dead of Winter Warring Colonies
* Dice Forge
* a few Dungeons and Dragons books
* Elder Sign expansions
* Eldritch Horror expansions
* Evolution Climate
* Exit
* Fallout
* Fate of the Elder Gods
* Feast For Odin
* First Martians
* Five Tribes expansions
* Fragged RPG
* Gaia Project
* Jaipur
* Jump Drive
* Kemet
* Kingdom Builder
* Kingdomino
* Kingsport expansion
* Legendary expansion
* Lovecraft Letter
* Magic Maze
* Mansions of Madness 2nd Edition expansions
* Massive Darkness
* Mountains of Madness
* My Little Pony RPG
* Netrunner packs
* New Angeles
* The Others
* Pandemic Second Season
* Pandemic Iberia
* Pandemic Rising Tide
* Paranoia RPG (Kickstarter)
* a few Pathfinder books
* Queendomino
* a couple of Red Dragon Inn expansions
* a few Savage Worlds books
* Scythe expansions
* Secret Hitler
* a few Shadowrun 5th books
* Sherlock Holmes expansions
* Cities of Splendor
* Talon
* Terraforming Mars
* The Thing
* Ticket to Ride Germany
* Time Stories
* Twilight Imperium
* via Nebula
* Villages of Valeria
* Whistle Stop
* Whitechapel expansion
* Whitehall
* a few X-Wing minis.

After getting things organized and making room for the next year, I have pictures.

Posted in Gaming | Leave a comment

‘If You Were Smart, You’d Be In Engineering’

This is a comment I heard a few years back from a manager on the Engineering side of the house. It's stated badly, of course, but the message behind it is that to advance in technology, your next step should be in Engineering (or Development), and you should be working towards that.

As a Systems Admin for many, many years, I've found that I'm good at the job: managing systems and providing solutions in Operations to improve the environment. It also plays to my strengths and weaknesses: troubleshooting problems, fixing them, and moving on, rather than spending all my time on a single product as a programmer or engineer. I've had discussions with prior managers about advancement, both in Engineering and in Management, and I've even attended management training. The conclusion of those discussions was that I should work with my strengths.

A few years back, I was moved into an Operations Engineer role. The company started implementing a Plan, Build, Run model and I was moved onto the Build team; no real reason was given for the choices. The concept was to move a portion of the team away from day-to-day maintenance duties in order to focus on business-related product deployments. The thinking went: it's taking too long for the entire team to get products into the field, what with being on call and dealing with maintenance issues like failing hardware, so let's pull three people away from the team to be dedicated to business-related projects.

Sadly, there was no realization that the delays were in large part due to multiple groups in Operations having their own priorities and being unable to fully align on a project. With three business groups each pulling in their own direction, the time to deploy really didn't change much, and now there were fewer people to do the rest of the work.

As we moved forward, the company redefined the Build role. Titles changed and we became Enterprise Systems Engineers. One of the new goals was to move away from server builds and into server automation: build one or two servers for a project in order to create the automation, then turn the rest of the project over to the Run team as a turnkey solution to complete.

In addition, a new project came along that was designed to follow the Continuous Integration/Continuous Delivery (CI/CD) process. The idea is to deploy software incrementally rather than through the current six- to eighteen-month release process.

The current process is to gather all the change requests that identify problems in the product, plus the product enhancement requests, and spend time creating the upgraded product. It can take months to get this assembled, get code written and tested, fix problems that crop up, get it through QA testing, and into Operations' hands. And as noted before, deployment itself can take several months due to other projects on my plate and tasks on other groups' plates.

With CI/CD there's a pipeline from Development into Operations; picture a stream. A problem is discovered, it's passed back to Development, they fix it and put it into the flow, automated testing confirms the fix doesn't break anything, it lands in the production support environment, and then production. Rather than taking months to assemble and fix a large release, you have a single minor release, and you can identify, fix, and deploy it into production in a matter of minutes.

Even better, the Developer or Engineer would be on call for their own product. If something happened, they'd be paged along with the teams, could potentially fix it right then, put it into the flow, and let the fix get deployed. Automated testing is key here, though, and part of any fix is adding or fixing tests to make sure the problem doesn't happen again, even in other products.
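The flow above can be sketched as a tiny shell script. This is purely illustrative: the stage names and commands are stand-ins I made up, not any actual tooling.

```shell
#!/bin/sh
# A minimal sketch of the pipeline flow described above: automated tests
# gate the change, then it promotes through the environments. Both
# functions are hypothetical placeholders.
set -e

run_tests() {
  # Stand-in for the automated test suite; a real pipeline runs unit
  # and integration tests here and fails the whole run on any error.
  echo "tests passed"
}

promote() {
  # Stand-in for a deploy step; $1 names the target environment.
  echo "deployed to $1"
}

run_tests
promote "production-support"
promote "production"
```

Because of `set -e`, a failing test stage stops the run before anything is deployed, which is the whole point of the gate.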

This process is pretty similar to how a single programmer working on a personal project might manage their programs. I have a project I've been working on for years that has reached about 140,000 lines of code (about 98,000 lines of actual program, with the remaining 42,000 lines being comments or blank lines). I've created a personal pipeline, though without automated testing, and an identified problem can be fixed and deployed in a few minutes. New requests generally take about 30 minutes to complete and push out to the production locations.

This CI/CD-oriented project was interesting, and it included creating a Kubernetes environment. Someone had commented about DevOps several years earlier, so I'd been poking at articles and had even read The Phoenix Project, a book about CI/CD and DevOps. A second project followed using CI/CD and Kubernetes. Part of the CI/CD process was utilizing Ansible and Ansible Tower, a tool I'd been interested in using for a few years.

As a Systems Administrator, one of your main responsibilities is to automate: if you're doing a task more than once, write a script. I started out hobby programming, then working part time and then full time, back in the early 80s. Even though I moved into system administration, I've been hobby programming and writing programs to help with work ever since. I believe it's been helpful in my career as a server administrator; I have deeper knowledge of how programs work and can write more effective scripts to manage the environment.

In the new Build role, that background has been even more helpful. It lets me focus even more on learning automation techniques with cloud computing (I've created Amazon Web Services and Google Cloud accounts) and on automating server builds. Years back I started creating "Gold Image" virtual machine templates and scripting server installations; the rest of the team has jumped in and we now have an even better installation script framework. Lately I've been working on a server validation script. For the almost 1,000 systems we manage, the script reports on 85,000 successful server checks, with errors called out so they can be corrected incrementally by the team (a local login report plus a central reporting structure). As part of this, I've started creating Ansible playbooks and added scripts to our framework to make changes to systems. Eventually we'll be able to use this to build an automation process that ensures servers are built correctly the first time.
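As an illustration of the validation approach (not the actual script), a check runner boils down to something like this; the two checks shown are trivial examples:

```shell
#!/bin/sh
# A toy sketch of the validation idea: run a series of checks, count the
# passes, and call out failures for the team to correct incrementally.
pass=0
fail=0

check() {
  # $1 is a description, $2 is the command that performs the check.
  if eval "$2" >/dev/null 2>&1; then
    pass=$((pass + 1))
  else
    fail=$((fail + 1))
    echo "FAIL: $1"
  fi
}

check "hosts file present" "test -f /etc/hosts"
check "temp dir writable"  "test -w /tmp"
echo "$pass checks passed, $fail failed"
```

The real script runs far more checks per server and feeds a central report, but each check follows this pass/fail-and-report pattern.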

It's been fun. It's a challenge and sort of a new way of doing things. I say "sort of" because it still falls under the "automate everything" aspect of being a Systems Admin. With virtual machines and Docker containers, we should be able to further automate server builds. With scripts defining how the environment should look, we should be able to create an automatic build process, plus a configuration management environment: a central location defining a server's configuration and keeping it stable. Make a change on the server and the configuration script changes it back to match the defined configuration; changes have to be made centrally.
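The configuration-management loop can be shown with a toy example; the file name, contents, and paths are all made up for the demonstration:

```shell
#!/bin/sh
# A toy sketch of the idea above: compare a server file to the centrally
# defined copy and put it back if it drifted. A real system would do this
# for many files, on a schedule, against a real central store.
base=$(mktemp -d)
mkdir "$base/central" "$base/live"

echo "server 10.0.0.1" > "$base/central/ntp.conf"    # the defined config
echo "server 192.168.9.9" > "$base/live/ntp.conf"    # a local change

if ! cmp -s "$base/central/ntp.conf" "$base/live/ntp.conf"; then
  # Drift detected: restore the central version.
  cp "$base/central/ntp.conf" "$base/live/ntp.conf"
  echo "restored ntp.conf from the central copy"
fi
```

Run from cron, a loop like this is what makes "changes have to be made centrally" stick: local edits simply get reverted on the next pass.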

DevOps, though. It often seems like this really means "Development learns how to do Operations". "If you were smart, you'd be in Engineering." Many of the articles I read, many of the posts on the various forums, and many of the discussions I've followed on YouTube and in person seem focused on Development becoming Operations. There doesn't seem to be much work on how to make Operations part of the process. The assumption seems to be that a Systems Admin will move into Engineering or Development, or will move on to an environment that still manages servers the old way. Alternately, we'll end up like COBOL programmers: a niche group that gets called in for Y2K-like problems.

I like writing code and I'm certainly pursuing this knowledge. I have a couple of ESX servers at home with almost 100 VMs to test Kubernetes and Docker, work out how to automate server deployments, and try different pipeline products like Jenkins and GitLab. I use them for my own sites in an effort to get knowledgeable about such a role, and I'm starting to use them at work. Not "DevOps", though, as that's not a position or role; it's a philosophy of working together. A "DevOps Engineer" is more along the lines of an engineer working on pipeline products: getting product automated testing in place, but also getting Operations-related automated testing in place.

If you’re not taking time to learn this on your own and the company isn’t working on getting you knowledgeable, then you’re going to fall by the wayside.

Posted in Computers | Tagged , , , , , , , , | Leave a comment

git Version Control for rcs Users – Setting up gitlab

Next up is installing Gitlab. The problem will be the inability to access the 'net from work, so a manual install will need to be done. We'll see how that goes.

Gitlab is pretty easy to install; just follow the website. I'm going to build the server, generate an RPM list and a file list, install gitlab, and then run a diff to see what changed. It may or may not help in setting up the gitlab server at work.

Comparing the before and after file and RPM listings, there were no differences prior to the install. Doing the installation per the website shows the following package and dependencies:

 gitlab-ce                                   x86_64                      10.1.3-ce.0.el7                       gitlab_gitlab-ce                      353 M
Installing for dependencies:
 audit-libs-python                           x86_64                      2.4.1-5.el7                           jumpstart                              69 k
 checkpolicy                                 x86_64                      2.1.12-6.el7                          jumpstart                             247 k
 libsemanage-python                          x86_64                      2.1.10-18.el7                         jumpstart                              94 k
 policycoreutils-python                      x86_64                      2.2.5-20.el7                          jumpstart                             435 k
 python-IPy                                  noarch                      0.75-6.el7                            jumpstart                              32 k
 setools-libs                                x86_64                      3.3.7-46.el7                          jumpstart                             485 k

For a local installation without access to the 'net, as long as I install the above dependencies on the server first and then the gitlab-ce RPM, I should be good.

Rebuilding here in a sec to test my theory.

Confirmed. I installed the above packages and, once done, installed the gitlab RPM as noted on the website:

export EXTERNAL_URL=http://lnmt1cuomgit1.internal.pri
rpm -ivh gitlab-ce-10.1.3-ce.0.el7.x86_64.rpm

Ready to be used…

Jenkins next…

Posted in Computers | Tagged , , , , | Leave a comment

git Version Control for rcs Users – Configuring gitlab and Jenkins Servers

This series is intended to document the steps involved in moving from an RCS-based environment to a git-based environment. In previous posts, I showed the environment I'm currently working in and my progress in changing from RCS to git. I duplicated the process of installing code to the target servers, so I can now use git from the command line much like I used RCS and my scripts. Now I want to move a bit further into the DevOps side of things: setting up an easier way for the team to manage the scripts and code, providing a method for external teams to access some of the code, and replacing my sync scripts with Jenkins.

I’ll need to duplicate my environment again, this time with even the same server names. Since I have an ESX host at home, I can set up several servers that effectively match my official environment.

* git server: lnmt1cuomgit1
* Jenkins server: lnmt1cuomjkns1
* Target Tool server: lnmt1cuomtool1
* Target Status server: lnmt1cuomstat1
* Target Inventory server: lnmt1cuominv1
* Worker server: a local VirtualBox system where I can pull and test scripts and the two websites.

Again, this is partly to help me manage my personal sites better, but also to help with the work effort to get us more focused on Dev type tasks.

The plan was for the Source server to have both gitlab and Jenkins installed; since Jenkins uses port 8080 rather than 80, they should have been able to cohabitate. Update: and after testing, that's a big nope. So, two different servers.

At least for the php site (Inventory), I'll need to do a bit of rewriting, as the settings.php file also contains the passwords to the database. Either the settings file needs to be replaced by a table, or the passwords will need to be pulled from a file outside the site's directory structure. The main bits right now, though, are the scripts the team actually uses. The Inventory has report scripts, data uploads, and of course the front end to the database, which the scripts don't touch.

Environment Specs:

Server          CPUs  RAM (GB)  Disk (GB)  /usr  /opt  /var
lnmt1cuomgit1   2     4         40         4     10    4
lnmt1cuomjkns1  2     4         40         4     8     grow
lnmt1cuomtool1  2     4         50         16    1     16
lnmt1cuomstat1  2     4         40         4     1     grow
lnmt1cuominv1   2     4         40         4     1     grow

On to gitlab…

Posted in Computers | Tagged , , , , , | Leave a comment

Cajun Shrimp and Salmon

Quick recipe.

Prep time
10 mins
Cook time
20 mins
Total time
30 mins

Sweet and savory pan-seared salmon topped with sautéed shrimp in cajun butter sauce. Salmon New Orleans is an unforgettable 30 minute meal your family will crave!

Author: Tiffany
Recipe type: Main dish
Cuisine: American
Serves: 4

4 6-ounce salmon fillets
salt and pepper, to taste
1 pound large shrimp, peeled and de-veined
8 tablespoons butter, divided
2 tablespoons honey

cajun seasoning
½ teaspoon salt
1 teaspoon garlic powder
1 teaspoon paprika
¼ teaspoon pepper
½ teaspoon onion salt
½ teaspoon cayenne pepper
heaping ½ teaspoon dried oregano
¼ teaspoon crushed red pepper flakes


In a small bowl stir together all cajun seasoning ingredients and set aside.

Season salmon with salt and pepper to taste. Melt 2 tablespoons butter in a large skillet over medium-high heat. Add honey and whisk to combine (mixture should be bubbly). Add salmon fillets to pan and cook for 5-6 minutes, then flip and cook another 7-8 minutes until salmon is cooked through, flaky, and browned. Transfer salmon to a platter and cover to keep warm.

Add remaining butter to the pan over medium-high heat. Once butter is melted, stir in cajun seasoning. Add shrimp to pan and saute until opaque, about 5-6 minutes.

Serve salmon topped with shrimp. Drizzle any extra pan sauce over the top and garnish with chopped parsley if desired. Serve immediately.


An alternate way to cook the shrimp is to boil them: fill a large pot with water and bring it to a rolling boil. Drop the shrimp into the boiling water. Once a shrimp turns pink and floats to the top, it's cooked and ready to be removed from the pot. Use a metal strainer to scoop out the floating shrimp and transfer them to a bowl.

Posted in Cooking | Leave a comment

Refinishing The Deck

The deck at the house is a bit dried out and aged looking. We'd been on the lookout for someone to do the siding and decks, as protection is needed here in the mountains; we're closer to the sun and the elements. As we were passing one of the homes up here, we spotted a yard sign for Mountain Woodcare. I called and had them come out to quote on refreshing the siding and deck. The owner, Jeremy ("J" or "Jay"), came out and, after a pretty thorough review, suggested the siding didn't need to be done right now but the decks could certainly use some TLC.

With the estimate in hand and a quick discussion with Jeanne, we approved the work, and they came out Monday last week to get started. We moved all the furniture and such off the deck so they could get right to work 🙂 Over the course of the week, they were out every day stripping the old paint off the rails and power washing everything with chemicals to get things nice and clean and ready for the stain and oil-based sealant. There was even some sanding needed. By Friday they had it all done and it looked excellent. Of course I took before, during, and after pictures, because I like to be able to compare and see the improvements.

We'd discussed work on the decks with the previous owners and again with Jay. Jay recalls coming out to give them an estimate, but there was no followup. Based on the looks of things, it had been at least 3 or 4 years since any maintenance was done on the decks. Fortunately, a bit of maintenance and TLC brought them back to beauty.

We have essentially four decks: a Kitchen deck, a Master Bedroom deck, a Lower deck that fronts the entire house, and an Entryway deck (upper by the garage, stairs down to the front door, and a wrap around to the MBR bathroom). I'll present each as a block of Before, During, and After pictures.

And as a note Jay was nice enough to take a few extra minutes to power wash the Gazebo. We’ll hit it with some fresh paint this week.

Kitchen Deck – Before

We were trying to show how dry all the wood was before the team got started. All the Before pictures try to catch it at different times of the day so you can see the differences.

Kitchen Deck – During

You can see the wood railings have been stripped and the deck washed. See how the water soaks in to the wood?

Kitchen Deck – After

And After looks great. Jay recommended a bit of tint in the oil rather than a clear oil, for UV protection. The wood is apparently Brazilian redwood. The thing to note later is the water beading up on the freshly oiled deck. Looking good.

Master Bedroom Deck – Before

Master Bedroom Deck – During

Master Bedroom Deck – After

Entryway Deck – Before

Entryway Deck – During

Entryway Deck – After

Lower Deck – Before

Lower Deck – During

Lower Deck – After

Posted in Colorado, Deck Refinish, Home Improvement, Rocky Knob | Leave a comment

git Version Control for rcs Users – Synchronization

Now that I can check out files, edit them, and check them back in, the last step is syncing the files with the target server or servers. I'm trying to eliminate the extra static files by moving them into the repo rather than having them be a second area to manage. Part of the problem is other teams: we want them to be able to manage files without having to log in to the git server and manually touch the old static files.

It works pretty much the same as the previous configuration:

1. Copy the unixsvc public key to the target server(s).

2. Set up a script to do a pull and check the return code.

3. If there are changes, use rsync to sync the data across.

Simple enough script. Set up a cron job to run it every minute and the target server(s) will always be updated.

I need to test the heck out of this to make sure it works as expected, add the other projects (less the inventory and status ones, which are websites), and finish documenting it so I can enable it at work.

Next up, gitlab and Jenkins. Let's try this through a web interface, using "normal" DevOps tools.

Posted in Computers | Tagged , , , | 3 Comments

git Version Control for rcs Users – Setup and Usage

At least for someone like me, the only person working on the projects, the setup and usage of RCS and git are pretty straightforward. Once we get into team usage, it gets a bit more complicated. Right now the team can check out and check in a script, but due to permissions they aren't able to sync the repositories. Fortunately the scripts do that every minute (checking for the flag file), but it's a bit cumbersome.

Setup ssh git

There are a few bits that need to be done in order to get git set up.

1. Create the git user on the git server. Make sure you have sufficient space for all the code; I created a 30 GB slice in /opt/git and used it as git's home directory.

useradd -c 'Git Service Account' -d /opt/git -m git
passwd git

2. You’ll need to add your public keys into git’s .ssh directory as ‘authorized_keys’. Do this for every server you will be pulling files from.

3. Create the master repository on the git server. You won't need to put any code into the directory, but you do need to run 'git init --bare' to initialize it.

mkdir /opt/projects
for i in suite inventory status httpd kubernetes changelog admin newuser
do
  mkdir /opt/projects/$i
  cd /opt/projects/$i
  git init --bare
done

The "--bare" option indicates this is a bare, master repository rather than a user's working directory. The working directory will be on your home system.

4. On your home system, create the local or working repository.

mkdir projects
cd projects

5. If you’re creating the first repository as I would for the ‘suite’ scripts, make the ‘suite’ directory and initialize it. You’ll want to set a couple of variables as well.

mkdir suite
cd suite
git init
git config --global user.name 'Carl Schelin'
git config --global user.email 'you@example.com'

6. Since I'm converting existing RCS files, I want to bring all the previous changes into git. I'm using rcs-fast-export, a Ruby script that imports the RCS history into git. Run the script in the project directory:

rcs-fast-export.rb . | git fast-import && git reset

Note: this script isn't working for the inventory application. I suspect it's because I have three places where the same file name is used for different purposes. Do some testing before you commit the updates.

7. Once done, push the code up to the git server. This will depend on what you have set up to manage repositories.

git push git@lnmt1cuomgit1.internal.pri:projects/suite

And you’re done. The project is ready for the team to retrieve and manage.

Team Setup

1. Very similar to the above, each member of the team will need to copy their ssh public key over to the git server and concatenate it onto git's authorized_keys file.

2. Create a projects directory. Don’t forget to set your git environment variables.

mkdir projects
cd projects
git config --global user.name 'Carl Schelin'
git config --global user.email 'you@example.com'

And you’re ready to edit code.

Managing Code

1. Before you can do anything, you'll need to retrieve the git project. The first time through, you'll have to use the clone option.

git clone git@lnmt1cuomgit1.internal.pri:projects/suite

This will retrieve all the files associated with the project you want to manage.

2. For subsequent updates, you’ll want to pull files from the server.

git pull git@lnmt1cuomgit1.internal.pri:projects/suite

3. You'll now have a 'suite' directory. Within that are a 'bin' and an 'etc' directory; files in these directories are managed by git. Coming from RCS, you'd expect to check a file out before editing, but git doesn't lock files, so you can edit directly. (In git, 'checkout' instead restores a file's last committed version, discarding any local edits.) Change to the 'bin' directory to work on the 'chkserver' script.

cd suite/bin

4. You can now edit the file. Once done, you'll need to check it back in. It's a two-step process: 'add' the change to the staging area, then 'commit' it.

git add chkserver
git commit chkserver

Your editor of choice will display the current 'git status' as comments. Anything that's not a comment will be added to the git log.

5. Once done, you’ll need to upload changes to the master.

git push git@lnmt1cuomgit1.internal.pri:projects/suite master

git Commands

List of commands that you’ll find useful. I’ll add more as I explore.

  • git status – Show the status of the project
  • git log – Show the commit log for the project or if you pass a file name, shows the commit log for the file.
Posted in Computers | Tagged , , , | Leave a comment

git Version Control for rcs Users – Background

As a Unix Systems Administrator, I'm a long-time user of the Revision Control System (RCS) for managing configuration files; my first use was managing DNS zone files at NASA Headquarters. Over the past few years, I've been using RCS to manage my personal projects and work shell scripts. While I'm not the only one writing scripts and code, I believe I write the bulk of them. Anyway, I want to bring the team on board with managing scripts: adding theirs into revision control and making it easy for the team to manage their scripts and mine. Then everyone on the team, including any new members, can benefit from managing each other's scripts.

To get myself set up, I need to come up with a git/RCS Rosetta Stone: not just commands but concepts. That means taking the hacks I currently use to make RCS work in a team environment and bringing the team on board with documentation they can understand and use. I'm doing this because I'm the one mainly using revision control for the scripts and I want to keep the history of the projects. Honestly, though, it's not super important to maintain the current history; moving straight over to git with the existing files would also work fine, and if problems occur, that's what may happen. We'll see as we progress.

There are plenty of git books and documentation, but a Google search doesn't really turn up a tutorial for moving from managing files in RCS to managing files in git. And while RCS is pretty simple in general, I do have a few scripts that help manage my coding environment. Since it's RCS, there are additional hacks to make it work the way I want, which makes it more difficult for others on the team to manage.

Background and environment setup first. Then some quick references on the rcs commands I'm using, followed by the scripts and data files I wrap around them, and a list of other scripts I use to manage the environment.

First off, you can poke at the various RCS books online without too much trouble to see what RCS is and how it works. In general, you have a directory of files. You can store versions in the current directory, or create an RCS subdirectory and the RCS commands will store versions there. I prefer the subdirectory method as it keeps directory listings cleaner.

The environment:

code Directory

* code – A set of directories that contain checked-out scripts and RCS subdirectories.
** [project] – The source code for that project.
** make[project] – A script that creates the provisioning directory structure using the manifest and copies all the files from the static directory.
** manifest.[project] – A file listing all the scripts that belong to the project. I use a couple of symbols to create directories and note what's being done.

archive Directory

Archived data. Either code bits that aren’t useful any more (in a [project] directory) or older data files. I like to maintain the data imports for historical purposes.

static Directory

The most current files that aren’t code. Spreadsheet .csv files for importing, pictures, data files, etc.

stage Directory

* stage – The provisioning staging area. The make[project] script copies the code and all the static data for the project into this directory under a [project] subdirectory.
** [project] – The staged files for the project.
** exclude.[project] – Files and directories that aren't to be synchronized.
** sync[project] – The script that uses rsync to synchronize the directory structure (code and static data) out to the various servers as required.

scripts Directory

* [project] – An individual's working directory for any scripts in that project.
* /var/www/html – My php working directory for the three web projects.

The sync[project] scripts run every minute out of cron, looking for a sync.[project] flag file created by the make[project] script.

The make[project] scripts are run every night at 1am. This ensures the scripts are up to date even if a sync wasn’t performed earlier.

The rcs commands I use:

* co – With the -l (lock) option, check out the current revision, place it in the current directory, and lock the revision. This prevents others from updating the script.
* ci – With the -u (unlock) option, check in, leave the revision unlocked, and keep the working copy in place for further editing.
* rlog – Show the history of the script.

There are a lot more options and commands but as it’s just me, I haven’t needed to explore too far. These three commands do everything I need.

The script wrappers I use:

I have several scripts I use to manage the environment. There are things I want to do to make sure work is complete and that all files are included when provisioning.

* check – This is a wrapper around the various check scripts. It greps out the comments from each check script and then lists all the scripts.
* checkdiff – Compares the passed script name with the master script to show you the differences between the two.
* checkin – Runs a ‘checkdiff’ command to show the differences between the two scripts, then runs the RCS ci -u command. As ci doesn’t show differences, I wanted to be able to see what had changed so I could properly document the change.
* checkinstall – Runs through the working directory and returns a file name if a script exists in the working directory but not in the master code directory.
* checkmanifest – Parses the manifest file and reports any files that are checked out and being worked on in the working directory.
* checkout – Sees if there’s a difference between the master script and the current script. If so, it simply exits. Otherwise it checks out the script and copies it into your working directory.
* checkrw – Basically checks for any script in the code directory that has ‘rw’ permissions indicating it’s been checked out. I use it to make sure I haven’t missed any scripts when checking in several.
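As an example of how simple these wrappers can be, here's roughly what checkrw boils down to; this is a sketch of the idea, not the actual script, and the directory argument is a placeholder.

```shell
#!/bin/sh
# A rough equivalent of the checkrw idea: under RCS, a checked-out file
# is left writable by the owner, so listing user-writable files in the
# code directory shows what's still checked out. checkrw_list takes the
# code directory to scan as its argument.
checkrw_list() {
  find "$1" -type f -perm -u+w -print
}
```

Running it before checking in a batch of scripts gives a quick "did I miss one?" answer without walking the tree by hand.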

In the code directory, I have a few scripts that help manage the code.

* findcount – This script creates a list of all files in the project and writes it to a countall file. It also counts lines of code, comments, etc. for statistical purposes.
** countall – The list of all files in the project.
** countall.backup – The make[project] script runs a diff against the countall and countall.backup files. If there's a difference, the script exits without creating the staging directory. To correct, just copy the countall file over the countall.backup file.
* fixsettings – This script makes sure the settings.php configuration file exists in every directory.
* searchall – This script searches every file in the working directory for the passed keyword. Helpful when looking for all instances of a keyword.
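The countall/countall.backup gate can be sketched like this; it's an approximation of the behavior described above, not the actual make[project] code, and the project directory is passed as an argument.

```shell
#!/bin/sh
# Sketch of the countall gate: refuse to stage when the project's file
# list differs from the last approved copy. countall_gate takes the
# project directory as its argument; the file names match the post.
countall_gate() {
  if ! cmp -s "$1/countall" "$1/countall.backup"; then
    echo "countall changed; review, then cp countall countall.backup" >&2
    return 1
  fi
  echo "countall unchanged; ok to stage"
}
```

The nonzero return is what lets the calling make[project] script bail out before building the staging directory.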

Where do the files go?

* Inventory – The inventory project goes to three servers in /usr/local/httpd/htsecure.
* php Scripts – This goes to four servers including the Ansible server to build host files. In /usr/local/httpd/bin.
* Shell Scripts – These scripts go to just the Jumpstart server and are then sync'd across all 1,200 servers.
* Kubernetes – These scripts go to each of the Kubernetes clusters in /var/tmp/kubernetes.

As you can see, there are several scripts and the environment is really configured for one person.

The goal is to document how to create your own working server using VirtualBox. One problem with a tool like Vagrant is that the work environment doesn't permit access to outside sites without going through a proxy, so setting up an environment that can also be used at work will be beneficial.

1. Create your working environment using VirtualBox. You need a working directory plus a web site for testing the two web projects.
2. Create a Source Code Control git server. I’m using ssh to retrieve projects.
3. Pull the project code to your working server.
4. Check out project code.
5. Check in project code.
6. Push the project code back to the git server.
7. Provision project code to the various servers as listed above.

The documentation, though, needs to reference the existing environment in order for the new commands and processes to be understood.

Posted in Computers | Tagged , , , | Leave a comment

The New Game Room

I've spent the past week moving boxes from the garage into the new house, mainly into the new Library/Guest Bedroom, Studio/Media Room, and Game Room. Part of this was getting shelves in place and boxes stashed where appropriate; part was unpacking boxes and putting books and games on shelves. And of course, assembling the gaming table. The table is actually four 3′x3′ modules, but due to the space taken by the couch, only two squares were assembled. It works pretty well as a board gaming table, though.

At the moment, the three Kallax shelves on the left are full of board games, along with the top two shelves of the far-right Kallax unit. The right-most two Kallax shelves are mostly RPGs, with the bottom three shelves of the last one also holding RPGs. The last shelf unit is a bit more haphazard, in part because I found several boxes of games while unpacking the Library.

For your viewing pleasure, here’s the Game Room:

As these were taken in the dark, here’s a pic from the original sale site:

And another one with a bit of precipitation 🙂

Posted in Gaming | Leave a comment