Friday, August 7, 2015

Embedding a Polymer (JavaScript) app in a Go binary.

For a while now I have been wanting to write a Polymer application with a Go backend. Then a couple of things came together that made me stay up a night or two hacking :)

Shoulders of Giants (components):

You can put your JavaScript UI in there.
TL;DR: How about a GitHub repo with all the sample code, called gopolymer? Maybe a sample website too (accepting CSS / design PRs, thanks).

So it is a bit of a devops dream of mine to ship a single executable file that contains both the frontend and the backend application. Build, scp, run, and done! (Go 1.5 works ;)


The process of deploying a static JavaScript web site is not difficult, but it lacks the simplicity I love about deploying a Go binary. Go has made this a lot easier, however it still means passing a parameter at runtime telling the server which directory to host the static files from. If only I could scp up a single binary that contained everything the application needed, including static files. In the past, runtime permissions constrained which static files a user could load (which corresponded to features). Some of the things this enables (a rough sketch follows the list below):
  • Deploying a binary that hosts static files without Docker, nginx, Apache, etc.
  • Security-constrained JavaScript functionality
  • Dynamically display maintenance pages
  • Repeatable builds of both AJAX API and UI code that is easy to release and rollback
Your ideas?
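
To make that idea concrete, here is a minimal sketch (not the gopolymer code itself) that assumes a code generation step like go-bindata has already turned the Polymer build output into an in-memory map of path -> file contents:

package main

import "net/http"

// staticFiles stands in for generated code: assume a build step
// (go-bindata or similar) produced this map from the Polymer output.
var staticFiles = map[string][]byte{
    "/index.html": []byte("<html><body>hello polymer</body></html>"),
}

func staticHandler(w http.ResponseWriter, r *http.Request) {
    body, ok := staticFiles[r.URL.Path]
    if !ok {
        http.NotFound(w, r)
        return
    }
    w.Write(body)
}

func main() {
    http.HandleFunc("/", staticHandler)
    http.ListenAndServe(":8080", nil)
}

Because the files live inside the binary, the scp-and-run deploy needs nothing else installed on the box.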


Makefile: it's what built the tools you use (Java, Python, Ruby, JavaScript, Node, etc.), so why not use it?

In recent memory I have used three different build tools for Java (Ant, Maven, Gradle), three JavaScript build tools (YUI Compressor, GWT, Gulp), plus rake, fabric, and shell scripts. Make is refreshingly simple in comparison.

GB: repeatable Go builds, yeah!

Dave is awesome. I find keeping my code on the internets in 2015 is obvious thanks to Go, and GB extends that simplicity and use of the internets to create a repeatable build. This blog seems to have the best docs:

Polymer: It's like Legos!

JavaScript frameworks of late sound great ("write less code") but in the end produce monolithic, hard-to-reason-about applications. I grew up building things out of Legos, and to this day I still find that the best way to build software. Playing around with Polymer reminds me of exactly that: the ability to create larger and larger components based on the web components standard. I mean, the cwidget library even has a chart element!!!!

Missing Pieces: It is not all a bed of roses

  • GB dependency finder: Searches code and vendors packages (maybe it exists and just needs better docs)
  • Polymer compressor: Compiles JavaScript / HTML into fewer files to improve load time (maybe it exists and just needs better docs)
  • Package and deployment: I have searched for a good tool to use for a deployment pipeline. Thinking maybe I need to take a stab at it.


make build  # Of course it uses make!!

./bin/gopolyd # Probably 


go run src/cmd/main.go


Thursday, May 21, 2015

McTest: A little http testing library I wrote for Go

Problem: This started out with someone asking me how to write a test in Go for an HTTP handler. Then I wanted to add some testing to my own web services (JSON APIs specifically) and HTTP handler wrappers (middleware).

Solution: McTest has a simple API that wraps the boilerplate code to make testing easier. In general it revolves around creating an http.Request and a mock response object with mctest.NewMockTestResponse(*testing.T), then calling the handler with the request and response objects. The response has a number of helper methods that allow for quick, easy assertions on the response code, the response body (as a string), and JSON.
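
A rough sketch of that flow (the import path and the assertion helper names here are my guesses, not necessarily McTest's real API):

package main

import (
    "net/http"
    "testing"

    "path/to/mctest" // hypothetical import path; use the real McTest repo
)

// helloHandler is a toy handler under test.
func helloHandler(w http.ResponseWriter, r *http.Request) {
    w.Write([]byte(`{"msg":"hello"}`))
}

func TestHelloHandler(t *testing.T) {
    req, err := http.NewRequest("GET", "/hello", nil)
    if err != nil {
        t.Fatal(err)
    }

    // NewMockTestResponse comes from the post; it is assumed to
    // satisfy http.ResponseWriter.
    resp := mctest.NewMockTestResponse(t)
    helloHandler(resp, req)

    // The assertion helper names below are assumptions.
    resp.AssertCode(http.StatusOK)
    resp.AssertBodyString(`{"msg":"hello"}`)
}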

Goal: To have a simple, magic-free library that removes the boilerplate code for validating the response body and response code. The httptest standard library has a response recorder, however it doesn't remove the boilerplate code I was looking to get rid of. Testify looked interesting, however they have deprecated their HTTP response capture in favor of the httptest standard library. Also, I have found that not using magic mock libraries has forced me to use interfaces, which gives me cleaner implementations.


Thursday, May 15, 2014

My Opinionated Guide To Go (golang)


The modern software environment with internal and external services has become a distributed system that is constantly changing (continuous deployment). A simpler language based on the unix philosophy(ies) that supports concurrency, expects errors, leverages the internet and ships with great tools is a wonderful thing!

Unleash the Unix!

Talking with dev ops people these days I get the sense that stacks look roughly like this:

Code -> WAR -> Classloader -> Micro Kernel (JBoss) -> JVM -> Docker -> Operating System -> Hardware

Code -> Rails -> RVM -> Docker -> Operating System -> Hardware

Code -> Framework -> PVM -> Docker -> Operating System -> Hardware

Code -> Node.js -> V8 -> Docker -> Operating System -> Hardware

Code -> Operating System -> Hardware

There was a time in the 90’s when most software was developed on an operating system that was not Unix. Having a virtual machine be it Java, Python, Ruby or others that tasted and smelled like Unix was a wonderful thing back then. Now we are in a time when Linux machines are the norm, it seems overly complex to have so many abstractions above the operating system. The security and resource management of Linux (Unix processes) is wonderful.

The VM set of languages, with their N-tiered stacks (and stacks of N tears), may also be using those layers to solve the packaging problem. Did I mention that Go compiles to a static binary? So if Docker and friends are being used to aid packaging and deployment, then Go's single binary makes for fewer tears.

Native Lightweight Concurrency

Quad-core phones, distributed web service data sources, and network IO are not problems of the distant future. Multithreaded programming done well is hard; 99% of the time I want a safe way to pass messages to concurrently running code, and 1% of the time I need lower-level access. Go solves these concerns natively in a very elegant and simple way.
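
For the 99% case, channels are that safe way to pass messages. A tiny illustrative example:

package main

import "fmt"

// fetch simulates concurrently running work that reports back
// over a channel instead of sharing memory.
func fetch(name string, results chan<- string) {
    results <- "fetched " + name
}

func main() {
    results := make(chan string)
    sources := []string{"users", "orders", "inventory"}

    for _, s := range sources {
        go fetch(s, results) // start each fetch concurrently
    }

    // Receive one message per goroutine; the channel is the only
    // point of coordination.
    for range sources {
        fmt.Println(<-results)
    }
}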

Errors are not exceptional


A modern application using web services, internal or external, generates a lot of network errors that are not exceptional. Another common complaint I hear about Go is that the code is littered with explicit error handling. This surprises me, because I have found that building a robust application is mostly about handling errors properly. Errors are not exceptions, and Go treats them like first-class citizens, exposing them to the developer where they happen.
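
For example, a web service call hands its error back right at the call site and the caller decides what to do (example.com is just a placeholder URL):

package main

import (
    "io/ioutil"
    "log"
    "net/http"
)

// fetchUsers returns the body of a web service call, or the error
// exactly where it happened.
func fetchUsers(url string) ([]byte, error) {
    resp, err := http.Get(url)
    if err != nil {
        // A network failure is ordinary, not exceptional: handle it here.
        return nil, err
    }
    defer resp.Body.Close()
    return ioutil.ReadAll(resp.Body)
}

func main() {
    body, err := fetchUsers("https://example.com/api/users")
    if err != nil {
        log.Fatalf("fetching users failed: %v", err)
    }
    log.Printf("got %d bytes", len(body))
}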

The Internets is a great place for lol cats and the source codz


One of the things I get a lot of questions about when I tell other programmers I write Go is why the dependency management is in the source code. This confuses me, because we (modern software developers) keep our code in repositories that are on the internet, or at least an intranet, so it has a URL, right? If we need a specific tag or version we can just use a source code link in our code to refer to the dependency, right? What is strange is that it is easy to just keep a tar.gz of the dependencies with the source code, just like any legacy dependency system in another language. I am so excited that there is no large XML file or extra programming language to manage dependencies and packaging. However, it is common to use a Makefile for Go, and nothing stops developers from using XML or any other time-consuming build system to compile Go. Usually I point out that you could add the one-line compile as an external exec in any build tool, but that seems like extra work to me.
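
The import path is the dependency reference. For example (gorilla/mux here is just a commonly used package standing in for any dependency):

package main

import (
    "fmt"

    // `go get` fetches this repository by its URL; a vendored tar.gz
    // of the same path works just as well.
    "github.com/gorilla/mux"
)

func main() {
    r := mux.NewRouter()
    fmt.Printf("router ready: %T\n", r)
}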

The second complaint I hear a lot is that the compiler refusing to build code with unused imports (dependencies) annoys a lot of developers. As a developer who has worked on many 100K-plus line code bases (scripting and statically typed, compiled), dependency management is a real pain. From a security standpoint, trying to make sure all the dependencies are properly updated / patched is a big deal. Removing dependencies, especially in a scripting language, becomes really challenging as the code base grows. Keeping it clean and keeping it simple is very attractive after having a few of those headaches.

Idiomatically Simple


The language fits in my head, which allows me to focus more brain power on solving the problem at hand and less on syntax. Go has very little magic; it looks and reasons pretty much exactly as you expect it to. The systems I work with tend to be very complex, with lots of data storage, services, and constant change.

Go tool chain

gofmt: This has saved me from having so many conversations I don’t care to have anymore.

godoc: Awesome command line tool for querying the documentation of the language and libraries; adding -http will start a server so the docs can be browsed as well.

go test: Built-in testing package is great!

Other Tools

export GOPATH=$HOME

This just saves so much time. It means all the source code will be on the import path under the developer's home directory, which makes it very unsurprising what version of the software is being used. It also makes it easy to make local changes to dependencies. It also means code lives at $HOME/src/$USERNAME/$PACKAGE. It is kind of like a convention.

glog for logging

This is the best logging library I have found besides the one that comes by default. The only tricky part: to get it to log to the console (stderr, rather than log files) you have to pass the -logtostderr flag.
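
A minimal sketch of how that looks (run it with -logtostderr to see the output on the console):

package main

import (
    "flag"

    "github.com/golang/glog"
)

func main() {
    // glog registers its flags (including -logtostderr) with the
    // standard flag package, so Parse must run before logging.
    flag.Parse()
    defer glog.Flush()

    glog.Info("server starting")
    glog.Warningf("run me with -logtostderr or this ends up in log files")
}

Then: go run main.go -logtostderr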

fresh is a wonderful tool for automatic recompiling

Since I work a lot on web apps, and it is nice not to have to recompile from the command line, I started using fresh. However, since I use glog I need a way to pass -logtostderr, and fresh did not support it. That was OK: after a couple of minutes of looking at the code, I wrote a hack (and submitted a PR) so that I could use fresh with command line arguments.

Opinionated Ending

Choosing a programming language is choosing a set of compromises. When choosing Go, the two compromises that I notice most are: there isn't a library for everything (yet) and coding Go takes a bit more time than scripting. In no way do I use Go for everything, however I have found it more and more attractive for writing programs that can work together. I have noticed Go has become popular with the devops and system admin crowd lately. I wonder if it is because, in a complex application world, for me Go is like a glass of ice water in hell.

Discussion @ hacker news and reddit.

Monday, February 3, 2014

crypter: my attempt at improving security

Legolas: It is not the eastern shore that worries me. A shadow and a threat have been growing in my mind. Something draws near, I can feel it.... Orcs or the NSA? Either way, the lack of online security has been a growing concern for me. I have been researching how to improve the security of the software systems I work on. Initially my main motivation has been providing the same level of security for my users that I would expect from the products I use. It is surprising how much FUD there is in the computer security industry. Fear sells computer "security" products, and this fear seems to have infiltrated software development culture. The notion that there are "secure" systems and insecure systems implies that the "secure" systems have a highly trained group of mathematicians keeping them "secure". In reality, there is a spectrum of security, in which more complexity does not equal greater security. Just as almost any user can use sftp (ssh), developers can use simple APIs that implement the right encryption to increase the security of our apps.

A lot of this fear is based on improper implementations of security. However, exploiting a security hole in an encryption algorithm is a lot harder than it may sound. Even a bad implementation that leaves a security hole is better than no encryption at all, because the bar is so much higher. My goal is just to constantly raise the bar of security of the systems I work on. Even though I think it is going to be a challenge to defend against someone like the NSA (who have unlimited resources), we can build reasonably secure systems to keep out the Orcs.

Crypter is a tool I built to help encrypt data from the command line. What I wanted was a single binary that could encrypt and decrypt using symmetric-key encryption. Originally my goal was that any code or data transmitted over the wire be encrypted, and doubly so: once with SSL or SSH and once with symmetric-key encryption.
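
Crypter's own code is not reproduced here, but the symmetric-key primitive a tool like it can wrap lives in Go's standard library. A sketch using AES-GCM (an illustration, not crypter's actual implementation):

package main

import (
    "crypto/aes"
    "crypto/cipher"
    "crypto/rand"
    "encoding/hex"
    "fmt"
    "io"
    "log"
)

// encrypt seals plaintext with AES-GCM; key must be 16, 24, or 32 bytes.
func encrypt(key, plaintext []byte) ([]byte, error) {
    block, err := aes.NewCipher(key)
    if err != nil {
        return nil, err
    }
    gcm, err := cipher.NewGCM(block)
    if err != nil {
        return nil, err
    }
    nonce := make([]byte, gcm.NonceSize())
    if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
        return nil, err
    }
    // Prepend the nonce so decryption can recover it later.
    return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

func main() {
    key := make([]byte, 32) // in practice, load or derive the shared key
    if _, err := io.ReadFull(rand.Reader, key); err != nil {
        log.Fatal(err)
    }
    out, err := encrypt(key, []byte("deploy script contents"))
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(hex.EncodeToString(out))
}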

Practically this means using crypter to encrypt my code and push it to a secure S3 bucket. The servers that are deploying it then download it from S3, decrypt it, and deploy the code (or binaries). This is also how I encrypt backups, and before I send any data over the wire to production servers I try to make sure it goes through crypter. What is nice about crypter is that it compiles to a single binary and does not have any dependencies. So for devops tasks on a fresh server instance you can scp crypter and the data to the new server, then decrypt the data on the server with no dependencies. Once the setup script is decrypted you can run it to finish the rest of the setup of the server.
As I said, I am trying to improve security, so please send me any feedback, bugs, or suggestions.

Wednesday, January 15, 2014

mdserve: a Markdown utility binary written in Go

Problem: (markdown docs) formatting

Markdown is a pretty good format for docs, so I try to use it as much as possible. I notice that what looks fine in vim often doesn't render very well as HTML once it gets pushed to GitHub or run through a doc generator. Commonly I end up touching up simple formatting in docs days or even weeks later, which takes extra time.

Solution: mdserve, an HTTP server for a markdown file

Basically I am lazy. I could generate the HTML and then open it in a browser, however each time I changed the file I would have to regenerate the HTML and reload the browser page. What I wanted was a program where I could just reload the browser and it would redisplay my changes. This isn't exactly what GitHub or a doc generator would produce, but it catches all of the formatting issues I normally have.
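
The core of the idea is small. A sketch of an mdserve-style server (using the blackfriday package as one possible renderer, which may or may not be what mdserve actually uses):

package main

import (
    "io/ioutil"
    "log"
    "net/http"
    "os"

    "github.com/russross/blackfriday"
)

func main() {
    if len(os.Args) < 2 {
        log.Fatal("usage: mdserve <file.md>")
    }
    path := os.Args[1]

    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        // Re-read the file on every request so a browser reload
        // always shows the latest edits.
        md, err := ioutil.ReadFile(path)
        if err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        w.Header().Set("Content-Type", "text/html")
        w.Write(blackfriday.MarkdownCommon(md))
    })

    log.Fatal(http.ListenAndServe(":8080", nil))
}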

Give it a try:

go get

Friday, January 10, 2014

codap: a library I created to port some concurrent access patterns from Go to Python

As I was building an application that used a couple of different data stores (MongoDB and S3), there came times when I needed to optimize the performance of a couple of operations. It seemed that doing the IO concurrently boiled down to three very simple patterns.

Key / Value (dictionary, map, etc.): It was very common that I would need to get data from data stores or third-party services and put it into a dictionary to be rendered with a template or marshaled to JSON for a web service. I used this pattern everywhere, and it is even used to implement the Ordered List pattern.

Ordered List: In some cases the data that was being retrieved needed to maintain its ordering.  Usually this was because it was sorted in some way.

First Reply: In other cases, especially when dealing with many large files that then needed processing, the order didn't matter. I either just needed the first one that came back, or I needed to get all of the files and start processing them as soon as possible. This came in very handy specifically when I was doing encryption / decryption and compression: all the files were going to end up encrypted and compressed in the same file or stream, so it didn't matter which one was first, just that it happened as fast as possible.
The complex part was measuring performance. The database was consistent, however both S3 and the third-party services had a large range of response times (network traffic in AWS is my guess). In general, though, I found this was very beneficial if the IO operations were large, or if there were many operations, even small ones. Large reads or writes happened concurrently, so the total tended to last only as long as the largest operation took; thus three large IO operations took only 1/3 to 1/2 of the original time. With many small operations the performance was more complex, however in general I would see around a 40% improvement.
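
codap itself is Python, but since the patterns came from Go, the "First Reply" shape looks roughly like this on the Go side (the names and timings are made up for illustration):

package main

import (
    "fmt"
    "math/rand"
    "time"
)

// fetchFile pretends to download one file; the random sleep stands in
// for the variable S3 / third-party latency mentioned above.
func fetchFile(name string, out chan<- string) {
    time.Sleep(time.Duration(rand.Intn(100)) * time.Millisecond)
    out <- name
}

func main() {
    files := []string{"a.gz", "b.gz", "c.gz"}
    out := make(chan string, len(files))
    for _, f := range files {
        go fetchFile(f, out)
    }

    // "First Reply": process results in whatever order they finish,
    // not the order they were requested.
    for range files {
        fmt.Println("processing", <-out)
    }
}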

Please try it out and give me feedback and any notes you have. Works in Python 3!

pip install codap

Github: codap

Tuesday, January 7, 2014

Go Web Service Testing meets the Gorilla First Iteration

After making some headway with McTest in my previous post, I was still not able to test any of the routing rules, because I use Gorilla mux, which is an awesome library. A bug could live in the routing code, so it was time to tackle the 800 lb gorilla (if you will). Since my goal was to get 100% test coverage for this simple web service, it had to be soup to nuts.

First let's take a look at the request handler code:

The InitRest function has the route-mapping code. It will need to get called before any test code can run.
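
The post originally embedded the real handler code here; below is a hedged sketch of the shape it describes (the route and key names are mine):

package main

import (
    "fmt"
    "net/http"

    "github.com/gorilla/context"
    "github.com/gorilla/mux"
)

// userIDKey is the kind of constant the post refers to: the handler
// reads its route variable through it rather than calling mux.Vars.
const userIDKey = "user_id"

func userHandler(w http.ResponseWriter, r *http.Request) {
    id, _ := context.Get(r, userIDKey).(string)
    fmt.Fprintf(w, "user %s", id)
}

// InitRest wires up the routes; it must be called before the server
// (or any test that goes through the router) can exercise the URLs.
func InitRest() *mux.Router {
    r := mux.NewRouter()
    r.HandleFunc("/users/{user_id}", func(w http.ResponseWriter, req *http.Request) {
        // Copy the mux route variable into gorilla/context under our
        // own key so the handler never touches mux.Vars directly.
        context.Set(req, userIDKey, mux.Vars(req)[userIDKey])
        userHandler(w, req)
    })
    return r
}

func main() {
    http.ListenAndServe(":8080", InitRest())
}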

Now the test code is an entirely different beast (Gorilla):
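
The original test code is not reproduced either; seeding the request with gorilla/context (the hack explained below) and reusing the hypothetical McTest helpers from the earlier post, it looks roughly like this:

package main

import (
    "net/http"
    "testing"

    "github.com/gorilla/context"

    "path/to/mctest" // hypothetical import path for McTest
)

func TestUserHandler(t *testing.T) {
    req, err := http.NewRequest("GET", "/users/42", nil)
    if err != nil {
        t.Fatal(err)
    }
    defer context.Clear(req)

    // Seed the request with the route variable mux would normally
    // extract, using the same key the handler sketch above reads.
    context.Set(req, userIDKey, "42")

    // The mock response is assumed to satisfy http.ResponseWriter.
    resp := mctest.NewMockTestResponse(t)
    userHandler(resp, req)

    // Assertion helper name is an assumption, not the real API.
    resp.AssertBodyString("user 42")
}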

There is some funkiness in getting this working; it is far from elegant or nice code. This is mostly a hack I arrived at after reading the Gorilla context code. The first hack is the constant bits set at the top of the handler code, which are needed to associate values with the request. The second is that instead of using the documented function mux.Vars we need to use context.Get. Not the end of the world, but I would like it to be a lot cleaner, so if anyone has suggestions on the cleanest way to fix this let me know.