Using secrets with Google AppEngine

For side project #4323194 (a Chrome extension that looks like this: 👂 and turns red when someone mentions you on GitHub), I needed to implement OAuth from AppEngine to GitHub. As I’ve mentioned before, OAuth is my nemesis, but for this project there didn’t seem to be a great way around it. It actually wasn’t as bad as I remembered… maybe I know more about HTTP now? Either way, I only messed up ~16 times before I got it authenticating properly.

When you want an app to work with the GitHub API, you go to GitHub and set up a new application, tell it what URL it should send people to after login, and it gives you a “secret key” that no one else should know. Then you simply implement OAuth’s easy, intuitive flow:

  1. Redirect a user to https://github.com/login/oauth/authorize when you want them to log in.
  2. GitHub will ask the person to log in, then redirect back to the URL you gave it when you set up your app, with a temporary code in the query string.
  3. You POST the secret key you got (along with that temporary code) to https://github.com/login/oauth/access_token.
  4. GitHub replies with an access token, which you can then use in the header of subsequent requests to access the API.
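For concreteness, here’s a minimal sketch of the request-building half of steps 1 and 3 in Python, using only the standard library. The client ID and secret values are placeholders, and error handling is omitted:

```python
from urllib.parse import urlencode

# Placeholder credentials -- the real values come from GitHub's app settings.
CLIENT_ID = "my-client-id"
CLIENT_SECRET = "sooper-secret"

def authorize_url(client_id):
    """Step 1: the URL to redirect the user to for login."""
    return ("https://github.com/login/oauth/authorize?"
            + urlencode({"client_id": client_id}))

def token_request_body(client_id, client_secret, code):
    """Step 3: the form body to POST to
    https://github.com/login/oauth/access_token. `code` is the temporary
    code GitHub appends to the redirect URL in step 2."""
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "code": code,
    })
```

GitHub’s reply to that POST contains the access token (step 4), which goes in the header of subsequent API requests.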

The problem here is #3: the secret key. In ye olde world of “I have a server box, I shall SSH into it and poke things,” I would simply set an environment variable, SOOPER_SECRET=<shhh>, then get that from my Java code. However, AppEngine prevents that sort of (convenient) nonsense.

So I poked around, and the thing I saw people recommending was to store the value in the database. This is an interesting idea. On the downside, it’s approximately a zillion times slower than accessing an environment variable. On the other hand, this is a login flow that already makes three separate HTTP requests; a database lookup isn’t going to make a noticeable difference. On the plus side, it “automatically propagates” to new machines as you scale.

So I began working on a class to store the secret. Requirements:

  • Easy to set: I want to be able to visit a URL (e.g., /secrets) to set the value.
  • Difficult for others to set: I don’t want users to be able to override keys I’ve created, or create their own keys.
  • Perhaps most importantly: difficult for me to unintentionally commit to a public GitHub repo. I am super bad at this, so I need a completely brain-dead way to never, ever have this touch my local code, otherwise it will end up on GitHub.

What I decided on:

Create a servlet (/secrets) that takes the key/value to set as query parameters. The servlet will only set keys I’ve defined in code (so visitors can’t set up their own secret keys) and will only set a key if it doesn’t already exist in the database. Thus, after the first time I visit /secrets, it’ll be a no-op (and can actually be disabled entirely in production). Because the secret is passed as a query parameter, it never touches my code base. It will appear in request logs, but I’m willing to live with that.
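The guard logic itself is simple. Here’s a sketch in Python for brevity (the real implementation is a Java servlet, shown below); KNOWN_KEYS and the dict-backed store are stand-ins for the hard-coded key list and the datastore:

```python
# Keys defined in code; visitors can't invent their own.
KNOWN_KEYS = {"github_secret"}

def handle_secrets_request(params, store):
    """params: parsed query parameters; store: key -> value mapping."""
    key = params.get("key")
    value = params.get("value")
    if key not in KNOWN_KEYS:
        return "unknown key, ignoring"
    if key in store:
        # Already set: every visit after the first is a no-op.
        return "no-op"
    store[key] = value
    return "stored"
```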

What this looks like in an AppEngine app:

<!-- web.xml - add handling for this URI -->
    <servlet>
        <servlet-name>secrets</servlet-name>
        <servlet-class>com.meab.oauth.SecretDatastore</servlet-class>
    </servlet>
    <servlet-mapping>
        <servlet-name>secrets</servlet-name>
        <url-pattern>/secrets</url-pattern>
    </servlet-mapping>

And the Java code does some URI parsing and then:

  private void findOrInsert(String key, String value) {
    Entity entity = getEntity(key);
    if (entity != null) {
      // No need to insert.
      return;
    }
 
    entity = new Entity(ENTITY_TYPE);
    entity.setProperty("key", key);
    entity.setProperty("value", value);
    datastore.put(entity);
  }

And the nice thing about using Google’s AppEngine datastore is that it’s easy (relatively) to write tests for all this.

You can check out the sources & tests at my git repo. (Note that the extension doesn’t actually work yet, right now it just logs in. I’ll write a followup post once it’s functional, since I think this might be relevant to some of my readers’ interests.)

Low-fat Skylark rules – saving memory with depsets

In my previous post on aspects, I used a Bazel aspect to generate a simple Makefile for a project. In particular, I passed a list of .o files up the tree like so:

  dotos = [ctx.label.name + ".o"]
  for dep in ctx.rule.attr.deps:
    # Create a new array by concatenating this .o with all previous .o's.
    dotos += dep.dotos
 
  return struct(dotos = dotos)

In a toy example, this works fine. However, in a real project, we might have tens of thousands of .o files across the build tree. Every cc_library would create a new list and copy every .o file into it, only to move up the tree and be copied again: the same .o file can end up being copied once per level of the tree. It’s very inefficient.

Enter nested sets. Basically, you can create a set with pointers to other sets, and it isn’t flattened until it’s needed. Thus, you can build up a set of dependencies using minimal memory.

To use nested sets instead of lists in the previous example, replace the lists in the code with depsets and combine them with |:

  dotos = depset([ctx.label.name + ".o"])
  for dep in ctx.rule.attr.deps:
    dotos = dotos | dep.dotos

Nested sets use | for union-ing two sets together.

“Set” isn’t a great name for this structure (IMO): a depset is actually a tree, and if you think of it as a set, you’ll be very confused by the ordering when you try to iterate over it.

For example, let’s say you have the following macro in a .bzl file:

def order_test():
  srcs = depset(["src1", "src2"])
  first_deps = depset(["dep1", "dep2"])
  second_deps = depset(["dep3", "dep4"])
  src_and_deps = srcs | first_deps
  everything = second_deps | src_and_deps
 
  for item in everything:
    print(item)

Now call this from a BUILD file:

load('//:playground.bzl', 'order_test')
order_test()

And “build” the BUILD file to run the function:

$ bazel build //:BUILD
WARNING: /usr/local/google/home/kchodorow/test/a/playground.bzl:7:5: dep1.
WARNING: /usr/local/google/home/kchodorow/test/a/playground.bzl:7:5: dep2.
WARNING: /usr/local/google/home/kchodorow/test/a/playground.bzl:7:5: src1.
WARNING: /usr/local/google/home/kchodorow/test/a/playground.bzl:7:5: src2.
WARNING: /usr/local/google/home/kchodorow/test/a/playground.bzl:7:5: dep3.
WARNING: /usr/local/google/home/kchodorow/test/a/playground.bzl:7:5: dep4.

How did that code end up generating that ordering? We start off with one set containing src1 and src2. Adding the first deps gives a new set whose children are srcs and first_deps. Then we create the second_deps set and add the tree we previously created to it. Finally, the iterator does a postorder traversal of the resulting tree.
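To make the traversal concrete, here’s a toy Python model of nested sets. This is not Bazel’s actual implementation: it just models a union as a new tree node pointing at both operands (with the right operand’s subtree traversed first, which reproduces the output above):

```python
class ToyDepset(object):
    """Toy model of a nested set: a tree node with direct items and
    child sets; union copies nothing."""
    def __init__(self, direct=(), children=()):
        self.direct = list(direct)
        self.children = list(children)

    def __or__(self, other):
        # Union just creates a new node pointing at both operands.
        # In this toy model, the right operand's subtree comes first.
        return ToyDepset(children=[other, self])

    def __iter__(self):
        # Postorder traversal: children before a node's own items.
        seen = set()
        def walk(node):
            for child in node.children:
                for item in walk(child):
                    yield item
            for item in node.direct:
                if item not in seen:
                    seen.add(item)
                    yield item
        return walk(self)

srcs = ToyDepset(["src1", "src2"])
first_deps = ToyDepset(["dep1", "dep2"])
second_deps = ToyDepset(["dep3", "dep4"])
src_and_deps = srcs | first_deps
everything = second_deps | src_and_deps
print(list(everything))  # ['dep1', 'dep2', 'src1', 'src2', 'dep3', 'dep4']
```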

This is just the default ordering; you can specify a different one. See the docs for more info on depset.

9 years of blogging have totally been worth it

Worth of Web is kind of a neat site:

Oh well. It’s been worth it to me.

Aspects: the fan-fic of build rules

Aspects are a feature of Bazel that are basically like fan-fic, if build rules were stories: aspects let you add features that require intimate knowledge of the build graph, but that the rule maintainer would never want to add.

For example, let’s say we want to be able to generate Makefiles from a Bazel project’s C++ targets. Bazel isn’t going to add support for this to the built-in C++ rules. However, lots of projects might want to support a couple of build systems, so it would be nice to be able to automatically generate build files for Make. So let’s say we have a simple Bazel C++ project with a couple of rules in the BUILD file:

cc_library(
    name = "lib",
    srcs = ["lib.cc"],
    hdrs = ["lib.h"],
)
 
cc_binary(
    name = "bin",
    srcs = ["bin.cc"],
    deps = [":lib"],
)

We can use aspects to piggyback on Bazel’s C++ rules and generate new outputs (Makefiles) from them. It’ll take each Bazel C++ rule and generate a .o-file make target for it. For the cc_binary, it’ll link all of the .o files together. Basically, we’ll end up with a Makefile containing:

bin : bin.o lib.o
	g++ -o bin bin.o lib.o
 
bin.o : bin.cc
	g++ -c bin.cc
 
lib.o : lib.cc
	g++ -c lib.cc

(If you have any suggestions about how to make this better, please let me know in the comments, I’m definitely not an expert on Makefiles and just wanted something super-simple.) I’m assuming a basic knowledge of Bazel and Skylark (e.g., you’ve written a Skylark macro before).

Create a .bzl file to hold your aspect. I’ll call mine make.bzl. Add the aspect definition:

makefile = aspect(
    implementation = _impl,
    attr_aspects = ["deps"],
)

This means that the aspect will follow the “deps” attribute to traverse the build graph. We’ll invoke it on //:bin, and it’ll follow //:bin’s dep to //:lib. The aspect’s implementation will be run on both of these targets.

Add the _impl function. We’ll start by just generating a hard-coded Makefile:

def _impl(target, ctx):
  # If this is a cc_binary, generate the actual Makefile.
  outputs = []
  if ctx.rule.kind == "cc_binary":
    output = ctx.new_file("Makefile")
    content = "bin : bin.cc lib.cc lib.h\n\tg++ -o bin bin.cc lib.cc\n"
    ctx.file_action(content = content, output = output)
    outputs = [output]
 
  return struct(output_groups = {"makefiles" : set(outputs)})

Now we can run this:

$ bazel build //:bin --aspects make.bzl%makefile --output_groups=makefiles
INFO: Found 1 target...
INFO: Elapsed time: 0.901s, Critical Path: 0.00s
$

Bazel doesn’t print anything, but it has generated bazel-bin/Makefile. Let’s create a symlink to it in our main directory, since we’ll keep regenerating it and trying it out:

$ ln -s bazel-bin/Makefile Makefile 
$ make
g++ -o bin bin.cc lib.cc
$

The Makefile works, but is totally hard-coded. To make it more dynamic, first we’ll make the aspect generate a .o target for each Bazel rule. For this, we need to look at the sources and propagate that info up.

The base case is:

  source_list = [f.path for src in ctx.rule.attr.srcs for f in src.files]
  cmd = target.label.name + ".o : {sources}\n\tg++ -c {sources}".format(
      sources = " ".join(source_list)
  )

Basically: run g++ on all of the srcs for a target. You can add a print(cmd) to see what cmd ends up looking like. (Note: We should probably do something with headers and include paths here, too, but I’m trying to keep things simple and it isn’t necessary for this example.)
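Since Skylark’s string formatting matches Python’s, you can reproduce what cmd looks like for the example //:bin target (name "bin", srcs = ["bin.cc"]) in plain Python:

```python
# Reproducing the cmd string built above for the example //:bin target.
name = "bin"
source_list = ["bin.cc"]
cmd = name + ".o : {sources}\n\tg++ -c {sources}".format(
    sources=" ".join(source_list))
print(cmd)
# bin.o : bin.cc
# 	g++ -c bin.cc
```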

Now we want to collect this command, plus all of the commands we’ve gotten from any dependencies (since this aspect will have already run on them):

  transitive_cmds = [cmd]
  for dep in ctx.rule.attr.deps:
    transitive_cmds += dep.cmds

Finally, at the end of the function, we’ll return this whole list of commands, so that rules “higher up” in the tree have deps with a “cmds” attribute:

  return struct(
      output_groups = {"makefiles" : set(outputs)},
      cmds = transitive_cmds,
  )

Now we can change our output file to use this list:

    ctx.file_action(
        content = "\n\n".join(transitive_cmds) + "\n",
        output = output
    )

Altogether, our aspect implementation now looks like:

def _impl(target, ctx):
  source_list = [f.path for src in ctx.rule.attr.srcs for f in src.files]
  cmd = target.label.name + ".o : {sources}\n\tg++ -c {sources}".format(
      sources = " ".join(source_list)
  )

  # Collect all of the previously generated Makefile targets.
  transitive_cmds = [cmd]
  for dep in ctx.rule.attr.deps:
    transitive_cmds += dep.cmds

  # If this is a cc_binary, generate the actual Makefile.
  outputs = []
  if ctx.rule.kind == "cc_binary":
    output = ctx.new_file("Makefile")
    ctx.file_action(
        content = "\n\n".join(transitive_cmds) + "\n",
        output = output
    )
    outputs = [output]

  return struct(
      output_groups = {"makefiles" : set(outputs)},
      cmds = transitive_cmds,
  )

If we run this, we get the following Makefile:

bin.o : bin.cc
	g++ -c bin.cc
 
lib.o : lib.cc
	g++ -c lib.cc

Getting closer!

Now we need the last “bin” target to be automatically generated, so we need to keep track of all the intermediate .o files we’re going to link together. To do this, we’ll add a “dotos” list that this aspect propagates up the deps.

This is similar to the transitive_cmds list, so add a couple lines to our deps traversal function:

  # Collect all of the previously generated Makefile targets.
  dotos = [ctx.label.name + ".o"]
  transitive_cmds = [cmd]
  for dep in ctx.rule.attr.deps:
    dotos += dep.dotos
    transitive_cmds += dep.cmds

Now propagate them up the tree:

  return struct(
      output_groups = {"makefiles" : set(outputs)},
      cmds = transitive_cmds,
      dotos = dotos,
  )

And finally, add the binary target to the Makefile:

  # If this is a cc_binary, generate the actual Makefile.
  outputs = []
  if ctx.rule.kind == "cc_binary":
    output = ctx.new_file("Makefile")
    content = "{binary} : {dotos}\n\tg++ -o {binary} {dotos}\n\n{deps}\n".format(
        binary = target.label.name,
        dotos = " ".join(dotos),
        deps = "\n\n".join(transitive_cmds)
    )
    ctx.file_action(content = content, output = output)
    outputs = [output]

If we run this, we get:

bin : bin.o lib.o
	g++ -o bin bin.o lib.o
 
bin.o : bin.cc
	g++ -c bin.cc
 
lib.o : lib.cc
	g++ -c lib.cc

Documentation about aspects can be found on bazel.io. Like Skylark rules, aspects can be a little difficult to read because they are inherently recursive, but it helps to break them down (and use lots of prints).

That’s senior programmer to you, buddy

After about a decade of professional programming, I have finally gotten promoted. For the first time. This is a weird industry.

Regardless, I am now a “Senior Software Engineer.” Woo!

Thinking about it, this has been a goal of mine for a long time. Now that I’ve achieved it, I’m not sure what’s next.

“…and Alexander wept, for there were no more worlds to conquer.”

The Haunted Homesteader

Andrew and I have always loved old places. Optimally, we’d like to live in a wizard’s tower on top of a mountain. However, we’d be willing to settle for a castle with a thousand acres of land. More realistically, we’d like an old place with a couple of acres. That you can reach without a car from NYC (I said more realistically, not actually realistically).

So, sometimes I browse the real estate listings, especially near the train stations along Metro-North, and one day I noticed something unusual. There was a place being sold right next to the train station. It was 10 acres of property. They were asking less than $1 million.

Okay, those were the good things. There were… a couple downsides. It was old (good) and had obviously been abandoned for years (not so good). It was missing certain crucial elements like windows. And a roof. It needed a completely new septic system and well replacement (well water in combination with septic tank problems: eww), needed new wiring, and god knows what else. It was a “historical property,” so any repairs we made had to be okay-ed by a historical accuracy board and materials would probably be exorbitantly expensive: no off-the-shelf windows from Home Depot. On top of all that, the walls were literally made of asbestos, so either we pay an exorbitant amount to have all of the asbestos removed, or pay an exorbitant amount for each repair we did because everyone would have to wear spacesuits and take crazy precautions.

So, I said, “Let’s just go up on the train and take a look. So I can get it out of my system.”

Andrew pretended to believe me and off we went. We got off the train and… the property was right there. Like, a 5-minute stroll up the hill from where we got off the train. Note the “up the hill” part: this thing was basically invisible from every angle. We’d gone hiking from this train stop a hundred times and never seen it before. It was on a bluff overlooking the Hudson. We carefully circled the property, the ridiculously large property, trying to get a clear view of the place. There was no actual road leading to it; it was just… abandoned by time, on a bluff overlooking the Hudson. I was done for.

We continued to circle it and the “backyard” (the part not overlooking the Hudson, did I mention it fucking OVERLOOKS THE HUDSON?!) melts into reserved state land, so it’ll never be developed. We’d have a forest in our backyard for perpetuity. We walked down one of the trails through the woods in the “backyard” and sat down on a big rock overlooking a waterfall. We discussed, and negotiated, and fantasized. We started off agreeing that we’d both be comfy offering 1/5 the asking price. A few hours later, our butts were freezing, Domino had crawled into Andrew’s lap, and we had negotiated ourselves up to, “well, the asking price is sort of reasonable…” We’re terrible negotiators.

So, we contacted the real-estate agent so we could see the inside of the house. She took us on a tour and, in some ways, the place was amazing: floor-to-ceiling bay windows with views of the river, fireplaces in every room, and more rooms than we knew what to do with. In other ways, it was… not so amazing. For example, it was built before indoor plumbing, so the original floor plan had not accounted for things like bathrooms. Nor closets; apparently that wasn’t a thing. It was built for rich people in the 1800s, so the kitchen was in the basement (keep the help out of the way). There was obviously no electricity, so none of the ceilings have wiring for light fixtures. The owners built an addition with plumbing & electric in the ’20s, but given its size and shape, it’s a bit quirky. For instance, they added an awesome “secret door” bookcase that swings open to reveal… a bathtub in a closet.

After seeing the inside, we were in love, but the love was tempered by the desire to not have to work until we died restoring the place. We talked to some contractors. We talked to our friends. We talked to our parents.

And… we decided against it. I’m a bit heartsick over it, but it’s just 10 years too early for us to be able to dedicate that kind of time to fixing up a place. Doing the mature, responsible thing sucks.

Now we’re mainlining Ask This Old House. When the time comes, we’ll be so ready.

The living room.


Recruiting review

Just got this message from a recruiter:

I know I recently reached out to you, but seriously, you’re the cat’s meow and I can see you being the perfect fit for [company]. Your current experience at Google is spot on with what our Talent Team is looking for.

[Description of company]

If I am way off base in my analysis of your profile, please let me know. However, if there is a small chance that I may have hit the nail on the head – I’d love to discuss the opportunity to join their team.

[Sign off]

deTECHtive | Talent Acquisition Manager

Points for actually describing what the company does. Points off for:

  • Using more analogies than I could swing a dead cat at.
  • deTECHtive
  • Being a Googler == what they’re looking for.

All in all, I rate it two resumes out of five: nothing egregious, but nothing appealing, either.

Using AutoValue with Bazel

AutoValue is a really handy library for eliminating boilerplate in your Java code. Basically, if you have a “plain old Java object” with some fields, there are all sorts of things you need to do to make it work “good,” e.g., implement equals and hashCode so it can be used in collections, make all of its fields final (and optimally immutable), make the fields private and access them through getters, etc. AutoValue generates all of that for you.

To get AutoValue to work with Bazel, I ended up modifying cushon’s example. There were a couple of things I didn’t like about it: I didn’t want the AutoValue targets to live in my project’s BUILD files, so I set things up so they were defined in the AutoValue repository instead. I figured I’d share what I came up with.

In your WORKSPACE file, add a new_http_archive for the AutoValue jar. I’m using the one in Maven, but not using maven_jar because I want to override the BUILD file to provide AutoValue as both a Java library and a Java plugin:

new_http_archive(
    name = "auto_value",
    url = "http://repo1.maven.org/maven2/com/google/auto/value/auto-value/1.3/auto-value-1.3.jar",
    build_file_content = """
java_import(
    name = "jar",
    jars = ["auto-value-1.3.jar"],
)
 
java_plugin(
    name = "autovalue-plugin",
    generates_api = 1,
    processor_class = "com.google.auto.value.processor.AutoValueProcessor",
    deps = [":jar"],
)
 
java_library(
    name = "processor",
    exported_plugins = [":autovalue-plugin"],
    exports = [":jar"],
    visibility = ["//visibility:public"],
)
""",
)

Then you can depend on @auto_value//:processor in any java_library target:

java_library(
    name = "project",
    srcs = ["Project.java"],
    deps = ["@auto_value//:processor"],
)

…and Bob’s your uncle.

You do you

I’m a little tired and depressed this week. However, this was a very inspiring speech Neil Gaiman gave to new grads of an art school:

Neil Gaiman Addresses the University of the Arts Class of 2012 from The University of the Arts (Phl) on Vimeo.

I think that his point about doing what you love, regardless of the money, holds doubly true for programmers. We are extremely lucky in that, unlike artists, we can make an okay salary nearly anywhere. We might as well work on things that make us happy.

Snail Spam

When I started blogging, I called my blog “Snail in a Turtleneck,” a cute image that Andrew & I came up with. I drew up my mascot:

A bemused snail, wearing a turtleneck.


and I began posting cartoons I had drawn. I quickly became bored of doing cartoons, and found I was more motivated to put up technical blog posts. Most of my initial readers were coworkers and MongoDB users. When Andrew and I got married, I told my teammates the day before that I’d be out the next day, as I was getting married (we got married at the city clerk’s, so it wasn’t a big production). When I got back to work, I found this at my desk:

A stuffed snail, a snail tape dispenser, and a very lovely bouquet (that was entirely free of snails).


I was very touched by their thoughtfulness: Andrew and I still have the stuffed snail and I brought the tape dispenser along to Google (where, unfortunately, it was later lost during an intra-office move).

However, as MongoDB gained popularity, some of my posts became very popular and I began to regret the name: customers seemed a little embarrassed to mention they had gotten advice from it and a lot of people didn’t realize that I was actually behind it. I purchased kchodorow.com, set up permanent redirects, and basically stopped referencing “Snail in a Turtleneck.” After a couple of years, I let the domain name lapse.

Last week, someone told me that my site had been hacked. I was confused, until they told me it was snailinaturtleneck.com. I took a look and, bizarrely, someone seems to have taken a dump of my site circa 2011, put spam on the index, and put it up at snailinaturtleneck.com, complete with my artwork, cartoons, etc. The domain was registered through a privacy-protection service, so I guess the next step is sending a DMCA takedown notice to the registrar.

Who does this? (I mean, spammers, but… so annoying.)

kristina chodorow's blog