Telcopunk

So we’ve had steampunk and dieselpunk, cyberpunk and seapunk.

Me, I’m going to call my aesthetic ‘Telcopunk’.

I favor practicality.

I believe in universal service and universal access.

Utilitarianism rules.

Research is important.

Unions are good.

Work locally. Think globally.

Distance is expensive.

Connecting people is important.

Information is and should be a primary concern of industry.

Designs should be made for durability.

An important job is building and maintaining infrastructure.

Privacy — but not security — is a core value, and standards of conduct reflect this.

Jeans. Work boots. Gloves.

Conceive things, then make them.

'How do I get good at programming?', I'm asked.

Read. Write. Publish. Repeat.

And in general, people’s opinions are meaningless without data to back them up. So ignore the haters.

Ignore the people saying you’re doing it wrong unless your job depends on it or they have good reasons.

People will tell you “JavaScript will die” or “Ruby is just a fad.”

Ignore the haters.

But also ignore the haters who say “Java is stupid.”

And ignore the haters who say “OO is wrong.”

And ignore the ones who say “OO is the only way” or “OO is the best way,” too.

But listen to the people who say “have you considered a different approach?” Those are the good ones.

Strong suggestions for structurally combatting online harassment

Craig Newmark asked for suggestions and here’s some things I came up with:

  • Create block functions that actually work and completely block all interaction with a user.
  • Create a mute function that doesn’t get tangled in block.
  • Respond to abuse reports, generating at minimum an inter-user block, and, when they actually involve any kind of escalation by the abuser, a block of that user from the service (or other strongly quarantining action).
  • Encourage use of pseudonyms rather than complete anonymity, if only to encourage a stable handle to block by.
  • Spam-fighting-like statistical models to detect outlier behavior — repeated first contacts by someone who’s been reported as harassing are a particularly significant sign. Being proactive and confirming with the harassed user might even make sense. “Is @username bothering you?”
  • Allow communities to segment when possible, rather than encouraging all users to share one single graph.
  • At least three-level privacy controls per account: Public, initial contacts restricted to friends, and all contact restricted to friends.
  • Create transparent policies and processes, so we can know how effective the service will be in supporting us if harassed, rather than shouting into the void, wondering if anyone actually reads these reports. If the policies or processes change, say something!
  • Do use decoy selections in report abuse forms, but keep it simple: “This is annoying” vs “this is dangerous” can be differentiated, and the decisions about how to handle those should be different.
  • Don’t patronize the people you’re trying to protect. Leave choices in the hands of those needing protection when it’s possible. For tools for protection that have downsides (social cost, monetary cost, opportunity cost), let those needing protection opt in or opt out. If the tools are independent of each other, let them be chosen à la carte.

And a rule of thumb:

If you spend less time fighting harassment than you do fighting spam, your priorities are wrong. If you take spam seriously and don’t take harassment seriously, you’re making it worse.

An unofficial mission statement for the #node.js IRC channel

This is the mission statement I seek to uphold when I provide support on the Freenode #node.js channel.

To support users of node.js and their related projects to create a collaborative,
creative, diverse, interested, and inter-generational sustainable culture of
programmers of all skill levels, with room and encouragement to grow.

One of the great criticisms of IRC channels for software support is that they’re often the blind leading the blind. Experts have little direct incentive to jump in on half-formed questions, and it takes some real skill to elicit good questions that can be answered accurately. There’s also some incentive for some members to use questions as openings to push their favorite tools, and to show off clever answers not necessarily in the best interests of the person asking.

The other problem is times of day — American nights and weekends have a lull, and questions asked then are often left to the void. Hard to answer questions — vague and incomplete ones especially — are the easiest to ignore. Let’s do the hard work to encourage good discussion, even among the less carefully asked, hurried questions.

We can do this and be unusual among technical channels. We’ve the critical mass to do it, and we’ve a great culture to start with. Let’s keep it up!

A Tale of Two Webs

originally posted on Medium

There’s a sharp divide between the people who make the Web, all of us, everywhere, and Silicon Valley Tech.

It’s a cultural divide I’ve seen come up again and again and again in discussions of tech culture.

On one side, we have the entitled, white frat-boy mentality of a lot of Silicon Valley start-up companies, with a culture going back years in a cycle of venture capital, equity, buy-out or IPO, repeat; a culture often isolated from failure by the fact that even the less amazing exits are still a solid paycheck. I suggest that this grew out of American industrial culture, the magnates of the nineteenth century turned inward into a mill of people all jockeying to be the next break-out success.

On the other side, we’ve the people who make the Web outside those silos. The lone designer at a traditional media publisher, doing the hard work to adapt paper print styles to the rapid publishing and infinite yet strangely shaped spaces of browser windows. The type designers who’ve now made their way out of lead blocks and work in Bézier curves. The scientist at CERN who realized that if every document had an address, scientific information would form a web of information. They don’t labor in a Tech Industry, they labor in their industries — all of them — connected by a common web.

In the media, it appears as one giant “Tech industry”, and perhaps this is bolstered by the fact that a great number of people don’t know what a lot of us do — a software developer and a designer are so much the same job to someone who’s not paying attention to the details.

And yet, on Wednesday, a great many people turned their Twitter avatars purple in support of a family who’s lost a child to cancer. Looking over who they were, something dawned on me: They were some of the best and brightest on the Web. Authors, developers, designers. The people who know and know of @meyerweb are the people who make the Web. This is the Web I care about, have always cared about. It’s the web of caring and sharing, of writing and collaborating. We take care of our own.

In skimming over the people who’ve gone purple, I notice one thing: The bios that list locations say things like “Cleveland, OH”, “Chicago, IL”, and “Cambridge, MA”. “Bethesda, MD”, “Phoenix, AZ”, “Nashville, TN”. “1.3, 103.4”. Their titles are “type designer”, “map maker”, “standards editor”, “librarian”, “archivist”.

And far, far down the list, a few “San Francisco, CA” and “San Jose, CA”, “Software Developer” and “Full-stack engineer”.

2112

You do occasionally visit Boston Public Library, yes?
If not, get on it! You were raised in and on libraries. They are in your blood!

You called me out rightly on that one! I’ve never actually been inside the BPL
— it’s on the Green line, the cantankerous part of the subway — and I just
haven’t been out there. Somerville’s is pretty limited — not nearly as big as
Englewood’s library, and it’s got a selection that’s definitely not aimed at
me.

I just saw the Arlington library night before last, actually, and it’s this big
huge modern building, it reminds me of the Koelbel library we used to go to.
It’s the first one I’ve been so excited to try to go to in a while.

It’s funny that you bring this up right now. I’ve been reading article after
article for the last year, but especially in the last weeks by librarians and
book publishers and authors talking about what the role of libraries is in a
world where it’s relatively easy to get ahold of the actual text anywhere and
anywhen.

There’s a whole argument that libraries are obsolete; a lot of this came out of
the crazy world of the California tech scene, where there’s this huge
Libertarian ‘government is evil, technology will solve all our woes’ thinking,
but that tends to assume that everyone is on average white, male, and upper
middle class. They’ve got a point, though, that for pure access to thought and
information, the Internet has done something unprecedented.

But libraries serve a few other purposes that e-books and the Internet can’t solve.
So many of my queer friends pointed out that libraries were their refuge as
kids and teenagers, from a world that was pretty intent on being horrible to them.
Often they come from families that were more than borderline abusive, and the
library was their safe place. There’s a whole generation of us for whom that rings
true, and kids coming of age now less often say that — but there’s never been
anything to replace that need for them.

Libraries are one of the few first-class public services, one of the few that
historically has ignored what economic class you’re from and has just provided
a service to everyone. That’s starting to change in some ways — inner city
libraries are starting to think of themselves as intervention points for kids
who won’t have access to reading before school and for poor families who can’t
cross that ‘digital divide’ and get on the Internet; they’re buying computers
and setting up more and more space for non-book-oriented services. They’re
focusing on the poor around them and abandoning the universal service model.

(I read a great quote today — “In Europe, public services are for everyone. In
the US, public services are for the poor who can’t afford a private
alternative” — and libraries are one of the few services where that’s not been
true.)

I’ve never been too keen on the model of librarians-as-authorities to appeal to
for information, but even so, having someone who knows the information out
there and can guide you is super important — it’s the role teachers really
should play, but don’t.

There’s a lot of thoughts on this rattling around in my brain trying to escape
coherently, but nothing’s made it out beyond this yet, and certainly not me
figuring out how I fit into it yet. Libraries are in my blood, but I’m not sure
if the thing I’m after is there, or if it’s something more abstract that I’m
chasing.

Anyway just wish we could be sharing another book together.

I’d like that, a lot. I think one thing that’s been lost in the mostly
fast-paced tech world is sharing thoughts about a big piece of writing. I
comment on blogs and articles, and discuss on Twitter a lot, but books don’t
have the convenient handles where you can just link to them and highlight
something and say “THIS is what’s right about this”. I want to share some of
those things and it’s not happening as much as it used to. I miss sharing them
with you!

Aria

Recipe: Storm in the Garden





#### Ingredients

* 10 ml lavender vodka
* 10 ml orange vodka
* 10 ml hibiscus vodka
* 200 ml ginger ale
* ice


#### Instructions

1. Drop the ice in a pint glass, pour in the ginger ale. Add the vodkas layered gently on top, ending with the bright red hibiscus.

Preparation time: 2 minute(s)

Number of servings (yield): 1

My rating: ★★★★★

Having vs. Owning | ps pirro

Sometimes people get confused about the difference between having something and owning it.

“I have an ipod” signals ownership. “I have a dog,” or a child, or a spouse, implies a relationship, a mutuality between sovereigns. Things get messed up for us, and for those with whom we are in relationship, when we confuse the one for the other.

Ownership denotes control. Relationship is wrapped up in reciprocity.

Ownership is unilateral. In relationship, something is always owed to the other. Always.

As a general rule, if a thing is alive — and for the animists among us, this includes pretty much everything — what you have is a relationship. Even if the law says otherwise.

Having vs. Owning | ps pirro.

Some thoughts on configuring web servers

If there’s one thing that has always annoyed me about running a web hosting and services business, it’s the low-level details of configuring virtual hosts in Apache and every other web server on the planet.

It’s all scriptable, but it’s error prone and completely graceless.

Users want to be able to define their own rules.

Apache configuration syntax, when included from users’ files, can break the entire configuration. It’s not dynamic, and reloading a hot web server can be expensive.

Nginx and Lighttpd are marginally more consistent, but still stink at delegating.

Configurations are sometimes order-dependent, sometimes evaluated root to leaf node, sometimes leaf node to root, and sometimes require recursing into the request handler to make decisions based on “what if” scenarios.

I’d willingly trade a lot of power in configuring a web server for something simple and able to be delegated to users.

There are some basic requirements:

  • Ability to configure redirects (and custom responses) for specific URLs and for entire subtrees of URL space. (I’m of the opinion that this should often not be handled at the application layer, since it’s most often needed to deal with changes to URL structure during application upgrades and transitions.)
  • Ability to map URLs to handlers located within the document root, without exposing the filenames of those handlers. (Thank you, PHP, for moving us backward 5 years in URL structure in an effort to teach us how simple deployment should be.)
  • The ability to direct entire subtrees to a handler.
  • The ability to direct entire subtrees to a handler if the request is not satisfiable with a url-to-path mapping.
  • The ability to direct requests to a handler if url-to-path mapping yields a file with a particular suffix (or perhaps indirectly via MIME type)
  • The ability to tweak url-to-path mapping if url-to-path mapping yields a directory.
  • The ability to add a variable passed on to a handler at any point in the subtrees of URL space, including setting it to a value from any part of the request headers, including a fragment of the URL.

And operationally, I want to be able to delegate the configuration of entire virtual hosts and preferably also subtrees of URL space to users, and have them only able to break the parts delegated to them.
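For illustration only, here’s a rough sketch of what such a delegatable, declarative configuration could look like, written out as a TypeScript data structure; every field name below is invented rather than taken from any real server.

```
// Hypothetical declarative config for one virtual host; all names invented.
type Rule =
  | { kind: "redirect"; to: string; status?: number }              // single URLs or whole subtrees
  | { kind: "handler"; handler: string; onlyIfUnmapped?: boolean } // hand a subtree to a handler
  | { kind: "suffixHandler"; suffix: string; handler: string }     // e.g. requests for *.php
  | { kind: "directoryIndex"; index: string[] }                    // tweak url-to-path for directories
  | { kind: "setVariable"; name: string; fromHeader?: string; fromUrlPattern?: string };

interface Subtree {
  prefix: string;       // the slice of URL space these rules apply to
  delegatedTo?: string; // a user who may edit only this subtree
  rules: Rule[];
}

const vhost: Subtree[] = [
  { prefix: "/old-blog/", rules: [{ kind: "redirect", to: "/blog/", status: 301 }] },
  {
    prefix: "/app/",
    delegatedTo: "alice",
    rules: [{ kind: "handler", handler: "app-backend", onlyIfUnmapped: true }],
  },
];
```

The point of the sketch is that each subtree is self-contained, so delegating /app/ to a user can only ever break /app/.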

1944 Red Velvet (cup) Cake

This is a red velvet cake made in a WW2 era way, using beets for moisture and color. The trick to getting good color rather than mud is to keep the batter acidic: lemon and buttermilk and a complete lack of alkaline leavening are what make this recipe unusual.

Boil two medium beets and puree. (You need one cup)

Cream two sticks of butter with a cup of sugar. Beat in two eggs as completely as you can.

Mix two tablespoons of lemon juice with 3/4 cup buttermilk. Add a cup of the beet puree.

In a bowl, mix a cup of flour and a quarter cup of natural (non-Dutch process) cocoa powder. (I used Hershey’s).

Beat the three mixtures together, adding some of the butter, egg and sugar mixture alternating with some of the beet and buttermilk mixture.

Pour into greased cupcake pans and bake at 350 °F until a toothpick or straw comes out clean.

This will be a soft, moist cake, almost custard. It released from the pan easily for me, though my cupcake pans are cast iron and a little unusual.

I used most of my batter as a layer under a cheesecake, but that’s a story for another time.

The world is a complicated place

Part 1 of an ∞ part series

:; cal 9 1752
   September 1752
Su Mo Tu We Th Fr Sa
       1  2 14 15 16
17 18 19 20 21 22 23
24 25 26 27 28 29 30

Cephalopod Surprise Chowder

Chop four slices of bacon and start cooking them over medium heat.

Chop one small onion, three small carrots, two sticks of celery.

Add them to the cooking bacon, with the fat.

Let them cook until the onions start to go transparent.

Add a cup or two of beer.

Add 3 or 4 fingerling potatoes, cut into small bite sized pieces.

Add water to cover and let this simmer until the potatoes are soft.

Chop four or five small squid into half-inch square pieces. Tentacles can be left in larger pieces.

Put these in a pan with a few tablespoons of melted butter. Cook briefly until the squid firms. (Ten or fifteen seconds, thirty at most.)

Add the squid to the simmering potato mixture.

Add a cup or two of small scallops, and a cup of small shrimp.

Cook a roux of equal parts butter and flour until the flour is golden-brown.

Add it to the simmering mixture, whisk to combine, and remove from the heat.

Add 3/4 cup of heavy cream, or 1 1/2 cups of half and half.

Season with salt and pepper, and add a quarter cup of chopped fresh dill.

Return to low heat and simmer until warmed through again. Don’t let the scallops overcook.

Let cool slightly, let stand for a bit, and serve.

A simple primer on cryptographic primitives

A field guide

Or “don’t trust anything that screws these up even slightly.”

Key

A private, hard to guess piece of information, meaningless on its own, but used to secure other pieces of information.

Public Key / Private Key

Specifically, these are keys with certain properties: they come as a pair, and they’re usually derived from a couple of large prime numbers (whose product is mathematically hard to factor, which is where their security comes from).

Things encrypted with one key can be decrypted with the other and vice versa.
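A minimal sketch of one direction of that, using Node’s built-in crypto module (assuming a reasonably recent Node; the message is made up): encrypt with the public key, and only the matching private key can open it. The other direction, proving something came from the private key, is what signatures below are for.

```
import { generateKeyPairSync, publicEncrypt, privateDecrypt } from "crypto";

// Generate an RSA key pair; the two keys are mathematically linked.
const { publicKey, privateKey } = generateKeyPairSync("rsa", { modulusLength: 2048 });

// Encrypt with the public key; only the matching private key can decrypt it.
const sealed = publicEncrypt(publicKey, Buffer.from("a message only the keyholder can read"));
console.log(privateDecrypt(privateKey, sealed).toString()); // prints the original message
```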

Hash

A cryptographic hash function (which is often based on an encryption function, but not always) takes a piece of information, often a big one, and turns it into a fixed-length token that represents it in a hard-to-fake way. Even small changes to the input will make a cryptographically strong hash function change its output entirely.

Some example hash functions: MD5, SHA1, SHA256, SHA512
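For example, using Node’s built-in crypto module, two inputs that differ by a single character produce completely unrelated SHA-256 digests:

```
import { createHash } from "crypto";

// One character of difference, two unrelated digests.
console.log(createHash("sha256").update("hello world").digest("hex"));
console.log(createHash("sha256").update("hello world!").digest("hex"));
```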

Signature

The result of using a key and a hash function together on a piece of information to give some proof that the information wasn’t forged. If the signing algorithm uses a public/private key pair, then the public key can verify that the information was signed by the private key.
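A small sketch with Node’s crypto module, using an RSA key pair as above and an invented message:

```
import { generateKeyPairSync, createSign, createVerify } from "crypto";

const { publicKey, privateKey } = generateKeyPairSync("rsa", { modulusLength: 2048 });
const message = "this document was not forged";

// Sign with the private key...
const signature = createSign("SHA256").update(message).sign(privateKey, "hex");

// ...and anyone holding the public key can check the signature.
console.log(createVerify("SHA256").update(message).verify(publicKey, signature, "hex")); // true
```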

Certificate

A signature on a public key, and usually some ID information. If the certificate was signed by a trusted party (trust is a complicated thing, though), then there’s usually some assurance that information signed by the private key that matches the certificate is from a known source. Of course, can you spot a forged ID?

HMAC

A way of hashing information with a key securely to form a signature that can’t be altered. It turns out that if you just start with the key, add data to the end of it, and hash that, an attacker can keep adding things, keep running the hash function from where it left off, and the signature will still look valid (a length-extension attack). HMAC mixes the key with the information being signed in a way that prevents this.
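A short sketch with Node’s crypto module; the key and data are made up, and a real implementation should compare MACs with a constant-time comparison rather than ===.

```
import { createHmac } from "crypto";

const key = "a shared secret";
const data = "user=alice&balance=100";

// HMAC mixes the key into the hash twice (inner and outer passes),
// which is what blocks the length-extension trick described above.
const mac = createHmac("sha256", key).update(data).digest("hex");

// The receiver recomputes it with the same key and compares.
console.log(mac === createHmac("sha256", key).update(data).digest("hex")); // true
```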

Salt

When you’re using a hash to make information hard to brute-force, you add randomness (a salt) to the thing you’re hashing, so an attacker can’t just build a list of all the likely inputs and check whether you have them. Since the salt changes the hash value, it has to be stored in a way that lets the thing comparing the hash do the same computation, so a salted hash often looks like $salt$HASH. Usually this is combined with a very slow, hard-to-compute hash function, so you can’t just whip through all the possibilities on a fast computer in a day or two. Computers keep getting faster, though…
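A sketch of that pattern using Node’s built-in scrypt (a deliberately slow, memory-hard hash); the helper names and the $salt$HASH storage format just follow the description above and are otherwise invented.

```
import { randomBytes, scryptSync } from "crypto";

// Hypothetical helpers: scrypt is deliberately slow, and the random salt
// defeats precomputed tables of likely passwords.
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 64).toString("hex");
  return `${salt}$${hash}`; // store the salt next to the hash, as in $salt$HASH
}

function checkPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split("$");
  return scryptSync(password, salt, 64).toString("hex") === hash;
}

console.log(checkPassword("hunter2", hashPassword("hunter2"))); // true
```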

TL;DR

Key = random and unguessable; key + hash = signature; signature over a public key = certificate; hash + salt = hard-to-crack salted hash.

Quote: Design Philosophies of Developer Tools

One of the nice things about Git is how its internals are both exposed for the world to see and thoroughly documented. We can easily write scripts to automate common tasks or create different workflows. With a bit more effort, we could even write new tools that integrate with the Git suite. These tools can do things that Git’s authors never intended, as long as they follow the documented repository structure. Git isn’t so much a version control system as the means to construct one.

And:

All of the Ruby development tools have independent release cycles, and they don’t seem to plan or coordinate with one another in advance of each release. Integration testing is left up to the users.

Design Philosophies of Developer Tools (via Digital Digressions by Stuart Sierra)

Some very good thoughts.

Gluten-free Crullers

Boil 1 cup of water

Add 2 sticks of butter and let it melt completely

Remove the mixture from heat.

 

Add 1 cup of Glutino all-purpose gluten-free flour. Another mix with some bean flour might have better texture at the expense of flavor.

Add 1/2 teaspoon Xanthan Gum

Add 1 tablespoon tapioca flour

 

Beat and add three eggs, one at a time, incorporating completely.

The dough will start out the texture of mashed potatoes, but eventually become a soft, pliable consistency between dough and batter. Work the batter hard until it’s completely smooth.

 

Heat oil for deep frying to 375 °F.

 

Fill a pastry bag with the dough, cut a 1/2" hole for the tip. Squeeze sticks or curls into the hot oil carefully. 

 

Fry until golden.

A great idea for SVG fonts

From this www-font posting by Adam Twardoch

INTRODUCTION

Obviously, SVG Fonts have some good and interesting concepts. One of their advantages is that they can — at least in theory — freely combine all aspects of SVG: multi-colored, multi-layered vector graphics, and bitmaps.

However, SVG Fonts also have some serious drawbacks: while the glyph definition using SVG is a great concept, all the other aspects of SVG Fonts that make them work as a font, especially the character mapping, access to alternate glyphs, and the layout behavior, are somewhat under-defined and hard to implement. Therefore, it’s rather unlikely that at any time, all OS and application vendors will agree on a good, full implementation of SVG Fonts.

Therefore, I’d like to suggest a different path: place an SVG Font as a table inside of an OpenType font*, and combine the power of both formats.

Thoughts on a (maybe) sane build system.

I’ve been thinking about build systems a lot this week, thanks to V8’s terrible use of scons, its replacement, gyp, still being a steaming pile of broken, and everyone’s collective hatred of autoconf.

I think Guru (vaporware with a good idea) is onto something, though.

I think a lot of build systems are too process-focused, which is exactly the path that leads to platform dependence and the craziness that’s come before.

If each module declares what it should know about the process, that’s a start: main.c knows that it’s the entry point of a program. It can say so. foofuncs.c knows that it’s the implementation of the functions defined in foofuncs.h, but it doesn’t know whether it’s destined for a static or dynamic library, or even just #included into other code. It can declare the parts it knows about.

Then, there can be module-level declarations: “These things form a coherent library”, “These parts are required for feature frobnicate”, “This must be linked with a library having function foo”.

Then at the package level, one has to configure the major options: are we installing in an application-specific root, or a system-wide one? Are we building full-featured or light? Cover-your-ass static linking of everything for a build that works everywhere, or shared-everything to play nice with the specific system being installed?
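As a sketch of the idea only (not of any existing tool), those declarations might look something like this if they were plain data; every field name here is invented:

```
// Purely hypothetical declarations, expressed as data.
interface FileDecl {
  file: string;
  role: "entry-point" | "implementation" | "interface";
  implementsHeader?: string; // e.g. foofuncs.c declares that it implements foofuncs.h
}

interface ModuleDecl {
  name: string;
  files: string[];
  feature?: string;            // "these parts are required for feature frobnicate"
  needsLibraryWith?: string[]; // "must be linked with a library having function foo"
}

const files: FileDecl[] = [
  { file: "main.c", role: "entry-point" },
  { file: "foofuncs.c", role: "implementation", implementsHeader: "foofuncs.h" },
];

const modules: ModuleDecl[] = [
  { name: "libfoo", files: ["foofuncs.c"], needsLibraryWith: ["foo"] },
];

// How to satisfy these (static vs. dynamic, install roots, feature sets) is then
// decided once, at the package level, instead of being baked into every file.
```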

What do you want a build system to do, and more, not do?

Oh, the unsuspected woes of server migration

Last night, I migrated The Internet Company’s servers from a Linux-VServer host to an OpenVZ host. It all went well, except one crazy detail.

The OpenVZ host runs on CentOS, and apparently its way of calling gettimeofday(2) doesn’t agree with PLD’s glibc’s way. Specifically, the 9th bit of the resulting time is wrong … some of the time.

```
with kernel.vsyscall64=0:
2011-05-09 17:14:48 -0600 1304982888 1001101110010000111010101101000

with kernel.vsyscall64=1:
2011-05-09 17:19:56 -0600 1304983196 1001101110010000111011010011100

with kernel.vsyscall64=0:
2011-05-09 17:15:04 -0600 1304982904 1001101110010000111010101111000
```

Fixed!

So what happens is that you get Dovecot complaining of things like this:

May 9 15:36:31 host dovecot: pop3: Fatal: Time just moved backwards by 290 seconds. This might cause a lot of problems, so I'll just kill myself now. http://wiki2.dovecot.org/TimeMovedBackwards

since the mail server needs to know what time things arrived, and the date going backward is a sign that the clock is Not Reliable.

It also gives errors like this when things move forward again:

May 9 15:33:54 host dovecot: imap-login: Error: master(imap): Auth request timed out (received 0/12 bytes)
May 9 15:33:54 host dovecot: pop3-login: Error: master(pop3): Auth request timed out (received 0/12 bytes)
May 9 15:33:54 host dovecot: imap-login: Error: master(imap): Auth request timed out (received 0/12 bytes)

So the solution? Set kernel.vsyscall64=0. It fixes the mismatch between guest and host on OpenVZ, making the five-minute jump disappear. Just pop that line into /etc/sysctl.conf and then apply it with sysctl -p.

Using the Learning Puppet VM under VirtualBox

First, get the VM from PuppetLabs.

Unpack the .tar.bz2 file so you see the learn_puppet_centos.vmwarevm directory.

Start VirtualBox, and create a new VM. Set it for Red Hat Linux (not the 64-bit variant). 512 MB of RAM is fine.

Add the cent55_386_ks.vmdk or a dummy image as the hard drive.

Save the VM, but don’t start it yet.

Edit the settings for the VM: remove the SATA controller and add the disk image to the IDE controller. The VM only supports the PIIX controller, so the AHCI SATA controller that VirtualBox 4 uses by default won’t work.

Boot! Enjoy! The login is root and the password is puppet.

On statistics

A friend asked me today what the standard deviation means for something that’s not normally distributed.

I had to answer “not terribly much”: an average and a standard deviation are good measures when things follow the normal distribution, where values cluster around a central point.

So what, then, is the right tool for his case, a long tail distribution? Most of his users last a certain number of months of service, and then each successive lengthening of the term has fewer and fewer users. I suggested percentiles or quartiles — show what that long tail looks like, and see where most of the users fall, where most is some interesting portion like 1/2 or 2/3.
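A rough sketch with made-up numbers, pulling a few empirical percentiles out of a list of customer lifetimes in months:

```
// Empirical percentiles of customer lifetime, in months; all numbers invented.
function percentile(sorted: number[], p: number): number {
  const idx = Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length));
  return sorted[idx];
}

const lifetimes = [1, 1, 2, 2, 2, 3, 3, 4, 5, 6, 8, 11, 14, 20, 36].sort((a, b) => a - b);

console.log(percentile(lifetimes, 50)); // the median lifetime
console.log(percentile(lifetimes, 75)); // 3/4 of customers are gone by here
console.log(percentile(lifetimes, 90)); // the start of the long tail
```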

All this comes down to estimating average lifetime revenues of customers of a business that isn’t all that old nor all that huge. It means the margins for error are larger, thanks to the relatively small populations.

At some point, I’ll have to revisit this post and add some graphs.

A great idea

From Patryk Zawadzki:

Here’s an idea for GNOME 3.x. Instead of showing a static wallpaper, start treating the wallpaper as an infinite plane. Basically instead of using a JPEG or PNG file as input, build a library that given a rectangle returns the image data (raster or even better vector) corresponding to the surface it covers. As monitors and workspaces come and go, the shell can expand and contract the background, calling the library as needed to build the missing parts.

Awesome! And parallax, multiple monitors. Great idea!
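Sketched as an interface, with invented names (a real implementation would live inside the shell or compositor, not in TypeScript):

```
// A sketch of the quoted idea; all names are made up.
interface Rect {
  x: number;
  y: number;
  width: number;
  height: number;
}

interface InfiniteWallpaper {
  // Given any rectangle of the infinite plane, return pixel (or vector) data
  // covering it; the shell calls this as monitors and workspaces come and go.
  render(region: Rect): Uint8Array;
}
```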