
Private npm modules

Posted on 8/9/11 by Felix Geisendörfer

Thanks to Isaac, npm is getting more and more awesome by the hour. One of the coolest recent additions (you need at least v1.0.26) is the ability to specify private git repository URLs as dependencies in your package.json files.

At transloadit, we are currently using this feature to move some of our infrastructure code into separate packages, allowing them to be tested and developed in isolation and making our core application easier to maintain and work on.

The syntax for referencing a git repository (and commit) is as follows:

{
  "name": "my-app",
  "dependencies": {
    "private-repo": "git+ssh://git@github.com:my-account/node-private-repo.git#v0.0.1",
  }
}

This will include a private npm module called "private-repo" from GitHub. The URL also contains an optional refspec (#v0.0.1) that tells npm which branch, commit, or, in this case, tag you want to have checked out.
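
If you want to track a branch instead of a tag, the refspec can just as well point at a branch name (or a full commit SHA). A minimal sketch, reusing the placeholder repository from the example above:

{
  "name": "my-app",
  "dependencies": {
    "private-repo": "git+ssh://git@github.com:my-account/node-private-repo.git#master"
  }
}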

Of course, this is not the only way to do private npm modules, but it is much simpler than running your own registry, so I would recommend it to most people.

Before you head off to play with this, here is a final tip that may save you some headaches: in all your private npm modules, add "private": true to your package.json. This makes sure npm will never let you accidentally publish your secret sauce to the official npm registry.
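
To illustrate, a minimal package.json for such a module could look like this (name and version are placeholders):

{
  "name": "node-private-repo",
  "version": "0.0.1",
  "private": true
}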

Happy hacking, --fg

PS: When deploying, don't forget that you need to authorize the server's SSH key for the GitHub repository you are depending on.
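
For example, you could generate a key on the server and verify that GitHub accepts it using the standard OpenSSH tools (the key comment is just an example):

# Generate a key pair on the server if it doesn't have one yet
ssh-keygen -t rsa -C "deploy@my-app"

# Add ~/.ssh/id_rsa.pub as a deploy key for the private repository
# on GitHub, then verify that the connection works:
ssh -T git@github.com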

 

How to fork & patch npm modules

Posted on 26/7/11 by Felix Geisendörfer

With now more than 3000 modules, there are huge gaps in the quality of things you find in the npm registry. But more often than not, you can find a module that is really close to what you need, if only it weren't for that one bug or missing feature.

Now, depending on who maintains the module, you may get the problem resolved by simply opening a GitHub issue and waiting a few days. However, open source doesn't really work without community contributions, and you don't always want to be at the mercy of someone else. So a much better approach is to actually roll up your sleeves and fix the problem yourself.

Here is the proper way to do this while using npm to manage your forked version of the module:

  1. Fork the project on GitHub
  2. Clone the fork to your machine
  3. Fix the bug or add the feature you want
  4. Push your commits up to your fork on GitHub
  5. Open your fork on GitHub, and click on the latest commit you made
  6. On the page of that commit, click on the "Downloads" button
  7. Right click on the "Download .tar.gz" button inside the popup, and copy the link ("Copy Link Address" in Chrome)
  8. Open up your package.json file, and replace the version number of the module with the url you just copied
  9. Send a pull request upstream (Optional, but this way you will avoid having to maintain that patch of yours against newer versions of the module you forked)
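
For steps 2-4, the command line portion might look roughly like this (the account and repository names are placeholders):

# 2. Clone your fork to your machine
git clone git@github.com:my-account/some-module.git
cd some-module

# 3. Fix the bug or add the feature, then commit it
git commit -am "Fix the bug that was blocking me"

# 4. Push your commit up to your fork on GitHub
git push origin master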

Example: My new airbrake module uses a forked version of xmlbuilder. I submitted my fix as a pull request, but it has not been merged yet. In order to pull in my changes via npm anyway, I simply pointed my package.json to the download URL of my fork on GitHub like so:

"dependencies": {
    "request": "1.9.8",
    "xmlbuilder": "https://github.com/felixge/xmlbuilder-js/tarball/4303eb2650a4b819a980b1dc9d2965862a1e9faf",
    "stack-trace": "0.0.5",
    "traverse": "0.4.4",
    "hashish": "0.0.4"
  },

Alright, let me know if this is helping your node.js adventures, or if you have an alternative workflow you are using. Otherwise, happy hacking!

--fg

PS: You should upgrade to the latest npm version first; some older versions had problems handling URL dependencies properly.
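
Assuming npm is already installed globally, one way to do that is to have it update itself (you may need sudo depending on your setup):

npm install -g npm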

 

Node.js Workshop in Cologne, June 10th

Posted on 20/5/11 by Felix Geisendörfer

We apologize for the short notice, but if you are looking to put node.js into production, this full-day node.js workshop we are organizing is where it's at!

The workshop is happening on Friday June 10, one day before nodecamp.eu. Space is limited to 15 people and expected to sell out quickly.

As a reader of our blog, you can get a 15% discount on the regular ticket by using the code 'debuggable'.

Should you attend?

This workshop will teach you everything you need in order to write and deploy powerful node.js applications. We'll try to cover a lot of ground, so if you are interested in any of the following, you should definitely attend:

  • Setting up node.js on your local machine
  • Understanding the module system
  • Using npm for installing and upgrading modules
  • Publishing your own npm modules
  • Everything you need to know about http.Server
  • Structuring your code using OOP in JavaScript
  • Dealing with all the callbacks in a sane fashion
  • Using the same code in node.js and the browser
  • Building realtime apps with Socket.IO
  • Using the express framework
  • An overview of testing tools available for node.js
  • Deploying node.js to EC2 / Joyent

The first half of the day will be guided by slides (which will be made available afterwards), with the second half being a hands-on session where we will build a small node app from scratch.

About the instructor

This workshop will be led by Felix Geisendörfer (that's me). He is one of the earliest and most active contributors to node.js, the author of over 20 npm modules, and runs one of the biggest node.js applications in production over at transloadit.com.

In addition, Tim Koschützki, who is also a co-founder at transloadit, will be available all day to help with individual questions and troubleshooting.

Questions & More Workshops

If you have additional questions or can't make it to this workshop, please head over to the workshop page, which has information on other upcoming workshops and answers to common questions.

--fg

 

Why are you not using an SSD yet?

Posted on 23/2/11 by Felix Geisendörfer

If you are a developer and you have not switched to an SSD yet, what is your excuse?

Let me explain. I switched to an SSD a little over a week ago, and it's a different world. You know that feeling of having just bought & set up a new machine, when everything still runs very fast? Well, an SSD will make every single day feel just like that, except much faster.

But I already knew that, so why has it taken me, and apparently you reading this, so long?

Well, my main problem was that I have a few big things on my hard disk, namely music, photos and virtual machine images. This means that I need a hard disk of ~300 GB to work comfortably. However, the SSD I was interested in only comes in 40, 60, 120, 240 and 480 GB sizes. The 480 GB model costs ~$1,580 right now.

A 240 GB SSD costs ~$520, which seems much less outrageous, but unfortunately that's still too small to serve as my only disk.

So for a while, I thought I'd have to wait another 1-2 years before enjoying the SSD experience. That was until I came across this article, which explained that you could replace your MacBook Pro's optical drive with an SSD. This meant I could add an SSD to my machine without giving up the luxury of cheap mass storage.

With this in mind, I decided to get a 120 GB SSD, which is plenty of space for my core system and applications. I followed a few YouTube videos for swapping out the disks, and I placed my previous HDD in the optical bay slot, since I've heard reports of hibernation problems if you put your primary disk there.

Making the new SSD my primary hard disk was easy as well. My initial attempt using Time Machine failed, so I simply booted up my system from the old primary HDD and used Carbon Copy Cloner to copy all data (excluding my music, images and VMs) to the new SSD. After that I made the SSD my primary boot disk using the Startup Disk preference pane and rebooted. The whole operation took about 1-2 hours.

So how has this changed my life? First of all, boot time is incredible. Compared to Tim's Mac (which is now scheduled for an upgrade ASAP as well), my machine goes from 0 to starting Photoshop in 48 seconds. Tim's machine takes 2 minutes and 50 seconds. Note: it takes about the same time for both machines to boot the kernel, but my machine is instantly ready at that point now.

Starting programs is either instant or 2-3 times faster than before. Recursive grep (using ack) is insanely fast, even on big projects. And git - it's a different world. If you've ever waited minutes while running 'git gc' on a big project, an SSD turns that into seconds. Everything just feels incredibly fast.

With this in mind, what's your excuse for not treating yourself to an SSD now?

--fg

PS: If you think you would miss your optical drive: you can get an external USB one for ~$40 on Amazon. If you really need the internal one back, I guess it would take you about 10-15 minutes to put it back in once you know the procedure.

PPS: If you're worried about the difficulty of replacing the disk: it's very easy, all you need to know is how to operate a screwdriver. However, make sure you've got the right tools. The OWC disk I'm recommending comes with a set of tools if you order it with the data doubler for the optical bay.

PPPS: My friend Joel pointed out the lack of TRIM support in OS X as a reason for not getting an SSD yet. That's a valid argument, but the OWC disks do not suffer from the lack of TRIM.

 

Talks, talks, talks

Posted on 18/2/11 by Felix Geisendörfer

I've been in Atlanta for the past two weeks, and thanks to the kind help of a few folks, I was able to present at 2 meetups, as well as Startup Riot 2011.

First up was a new talk at the Atlanta Ruby Meetup:


The talk was an attempt to cut through all the Kool-Aid and hype and focus on node's true strengths as well as its weaknesses. I think the whole thing was very well-received, but I certainly could have done a little better on the delivery.

Download: nodejs-should-ruby-developers-care.pdf (733 KB)

Next up was the 4th edition of my general introduction talk to node.js:


This version of the talk was updated for the freshly released v0.4, and I've also tweaked some other slides to the point where I'm very happy with it. It seems to do a great job getting people excited about node, as well as highlighting sensible use cases.

Download: nodejs-a-quick-tour-v4.pdf (610 KB)

And last but not least, I had the chance to do a 3 minute pitch for Transloadit at Startup Riot:


Download: transloadit-startupriot.pdf (640 KB)

Doing the actual presentation was quite scary. I have never given a talk this short; you basically don't get any time to warm up and get into things. You have to go out and give your best right away. Having an audience of ~500 people didn't help either.

However, I think I pulled it off fairly well. My main message was: "Save the time, save the money, save the shrink - use transloadit", and I highlighted some of the cooler aspects of our service, such as the realtime encoding. Lots of people came by our table afterwards to find out more about the service, including a few VCs and angels (we're not looking for investment right now, but seeing their interest feels good regardless : ).

There are a few more talks coming up in the next couple of months, but I also hope to find some more time for actual blogging again. I certainly want to start writing a few articles about testing JavaScript.

--fg

 