Archives for category: web

Our research lab is non-profit, but private GitHub repositories still cost money, so I have been playing with GitLab Community Edition to serve up some private Git repositories from a third-party host on the cheap.

Before using GitLab CE, I had set up a Git repository that, for whatever reason, would not allow users to cache credentials and would also not allow access via https (SSL). It was getting pretty frustrating to have to type in a long string of credentials on every push and pull, so setting up a proper Git server was one of the goals.

Installing and setting up the server is pretty painless. After installing the necessary packages and editing the server’s configuration file, I go into the GitLab web console, add myself as a user, and then add myself as a master of a test repository called test-repo.

When I try to clone this test repository over https, however, the clone fails with a Peer's Certificate issuer is not recognized error.

Git uses the curl library under the hood, so to debug this I put curl into verbose mode:

$ export GIT_CURL_VERBOSE=1

When I try the clone again, I get a bit more detail about the certificate issuer error:

$ git clone https://areynolds@somehost.lab.org:9999/areynolds/test-repo.git
Cloning into 'test-repo'...
* Couldn't find host somehost.lab.org in the .netrc file; using defaults
* About to connect() to somehost.lab.org port 9999 (#0)
* Trying ...
* Connection refused
* Trying ...
* Connected to somehost.lab.org (127.0.0.1) port 9999 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* failed to load '/etc/pki/tls/certs/renew-dummy-cert' from CURLOPT_CAPATH
* failed to load '/etc/pki/tls/certs/Makefile' from CURLOPT_CAPATH
* failed to load '/etc/pki/tls/certs/localhost.crt' from CURLOPT_CAPATH
* failed to load '/etc/pki/tls/certs/make-dummy-cert' from CURLOPT_CAPATH
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: /etc/pki/tls/certs
* Server certificate:
* subject: CN=*.lab.org,OU=Domain Control Validated
* start date: Oct 10 19:14:52 2013 GMT
* expire date: Oct 10 19:14:52 2018 GMT
* common name: *.lab.org
* issuer: CN=Go Daddy Secure Certificate Authority - G2,OU=http://certs.godaddy.com/repository/,O="GoDaddy.com, Inc.",L=Scottsdale,ST=Arizona,C=US
* NSS error -8179 (SEC_ERROR_UNKNOWN_ISSUER)
* Peer's Certificate issuer is not recognized.
* Closing connection 0
fatal: unable to access 'https://areynolds@somehost.lab.org:9999/areynolds/test-repo.git/': Peer's Certificate issuer is not recognized.

Something is up with the certificate from Go Daddy. From some Googling around, it looks like nginx does not serve the intermediate certificate unless it is bundled into the server certificate file, which leaves clients unable to chain the server certificate back to a trusted root.
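As a quick sanity check (a sketch of my own, assuming the openssl command-line client is installed; the host and port are the same ones used in the clone above), I can count how many certificates the server actually hands out during the TLS handshake:

$ echo | openssl s_client -connect somehost.lab.org:9999 -showcerts 2>/dev/null | grep -c 'BEGIN CERTIFICATE'

If this prints 1, the server is only presenting its own (leaf) certificate, and a client that does not already know about the GoDaddy intermediate has no way to build a chain of trust back to a root it recognizes.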

To fix this, I concatenate my wildcard certificate file with GoDaddy’s intermediate and root certificates, which are available from their certificate repository:

$ sudo su -
# cd /etc/gitlab/ssl
# wget https://certs.godaddy.com/repository/gdroot-g2.crt
# wget https://certs.godaddy.com/repository/gdig2.crt
# cat somehost.lab.org.crt gdig2.crt gdroot-g2.crt > somehost.lab.org.combined-with-gd-root-and-intermediate.crt
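Before pointing GitLab at the combined file, it is worth checking that the pieces fit together (these checks are my own addition, using only the filenames from the steps above):

# openssl verify -CAfile gdroot-g2.crt -untrusted gdig2.crt somehost.lab.org.crt
# openssl x509 -noout -subject -in somehost.lab.org.combined-with-gd-root-and-intermediate.crt

The first command should report OK, confirming that the downloaded GoDaddy certificates really are the issuers of the wildcard certificate. The second reads only the first certificate in the bundle, which should be the *.lab.org server certificate itself, since nginx expects the server certificate first, followed by the intermediates.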

I then edit the GitLab configuration file to point its nginx certificate file setting to this combined file:

...
################
# GitLab Nginx #
################
## see: https://gitlab.com/gitlab-org/omnibus-gitlab/tree/629def0a7a26e7c2326566f0758d4a27857b52a3/doc/settings/nginx.md

# nginx['enable'] = true
# nginx['client_max_body_size'] = '250m'
# nginx['redirect_http_to_https'] = true
# nginx['redirect_http_to_https_port'] = 443
nginx['ssl_certificate'] = "/etc/gitlab/ssl/somehost.lab.org.combined-with-gd-root-and-intermediate.crt"
...
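The matching private key setting lives in the same block and needs to point at the key the wildcard certificate was issued against (the filename below is an assumption on my part; substitute whatever your key file is actually called):

nginx['ssl_certificate_key'] = "/etc/gitlab/ssl/somehost.lab.org.key"  # assumed key filename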

Once this is done, I then reconfigure and restart GitLab the usual way:

$ sudo gitlab-ctl reconfigure
$ sudo gitlab-ctl restart

After giving the server a few moments to crank up, I then clone the Git repository:

$ git clone https://areynolds@somehost.lab.org:9999/areynolds/test-repo.git
Password for 'https://areynolds@somehost.lab.org:9999': ...

I can even cache credentials!

$ git config credential.helper store
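One caveat worth mentioning (a general Git detail, not anything GitLab-specific): the store helper saves credentials in plain text to ~/.git-credentials. If that is a concern, the cache helper keeps them in memory for a limited time instead, for example:

$ git config credential.helper 'cache --timeout=3600'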

Much nicer than the previous, non-web setup.

Cartograms with d3, TopoJSON, and Same-Sex Marriage

Shawn Allen wrote a d3.js-based implementation of a 2D cartogram, which resizes US states so that each state’s area is proportional to some interesting statistic, like population.

There has been a great deal of progress in the last year in defending the rights of GLBT Americans to marry and to have their partnership rights acknowledged: rights like hospital visitation and estate planning, which straight couples take for granted when visiting a loved one in the hospital or sharing their lives in the house they own.

It’s easy enough to see a map of the 50 states colored by legal status, but people are not spread evenly across the states. I wanted to see how the United States was progressing as a function of population.

I forked Allen’s project (GitHub source code available here) and redid the color scheme, which takes the 50 states and the District of Columbia and shades them by their legal status: whether their laws defend or remove same-sex marriage rights (and associated protections).

Green states allow same-sex marriage, light-green states allow civil unions, orange states allow marriage or civil unions (but rulings are currently held up on appeal), and red states do not defend same-sex marriage rights, either by explicit law or by constitutional amendment.

I based the color assignments initially on data from the Right to Marry site, up-to-date as of May 19th, 2014. But with Pennsylvania’s Gov. Corbett conceding defeat and vowing not to appeal the ruling, I added Pennsylvania to the list of pro-equality states.

In addition to showing how fast things have changed, drawing by area makes it clear that over half the country, by 2010 US Census population counts at least, now enjoys (or will soon enjoy, pending appeals) legal protections that were once denied to a minority of Americans.

When I want to make BEDOPS documentation available for offline browsing, here is the command I use:

wget --no-parent --recursive --page-requisites --html-extension --convert-links -E -l 2 http://code.google.com/p/bedops

This handy wget invocation fixes up img and other URL references so that links to images and other resources load from the local copy. What the end user sees is almost exactly what she would see if the documentation were retrieved over a network connection. Very nice.
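Browsing the mirror afterwards is just a matter of opening the local entry page in a browser. The path below is an assumption on my part, based on wget’s default habit of mirroring into a directory named after the host, so adjust it to whatever actually landed on disk:

$ xdg-open code.google.com/p/bedops/index.html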