Ruby

We do a LOT of file processing, bringing in customer data for analytics, so we're always looking for ways to make things faster. The CSV class built into Ruby is great, but we found that if we could give up a few options, we could make it a good bit faster while staying in Ruby.

Assumptions:

- The CSV file has a header row
- You want a hash emitted for each row (headers are converted to symbols)
- You don't need converters - all values are strings
- Most rows in the CSV file do not have quotes

What does it do?
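To make those assumptions concrete: when a line contains no quote character, you can skip CSV's full quoting rules and just split on commas, falling back to the stdlib parser otherwise. A minimal sketch of that idea (the method names are illustrative, not the gem's actual API, and it does not handle newlines embedded in quoted fields):

  require 'csv'

  # Emit one hash per row; keys are the header cells converted to symbols.
  def each_row(path)
    File.open(path) do |file|
      headers = parse_line(file.gets).map(&:to_sym)
      file.each_line do |line|
        yield headers.zip(parse_line(line)).to_h
      end
    end
  end

  # Fast path: unquoted rows get a plain split on commas.
  # Slow path: anything with a quote goes through the stdlib CSV parser.
  def parse_line(line)
    line = line.chomp
    if line.include?('"')
      CSV.parse_line(line)
    else
      line.split(',', -1) # -1 keeps trailing empty fields
    end
  end

  # Usage: each_row('customers.csv') { |row| puts row[:customer_id] }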

Recently we were struggling with some performance issues - importing data was taking way longer than it should.

Finding the problem

Ruby profiler to the rescue! This gem represents everything I love about the Ruby community. It's open source, it's intuitive, and it just works, so I can get back to building applications.

Profiler setup

The simplest setup for testing specific code comes straight from their README:

  require 'ruby-prof'

  # Profile the code
  result = RubyProf.profile do
    # …
  end
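Filling in that block with some stand-in work and printing the results looks roughly like this (RubyProf::FlatPrinter ships with ruby-prof; the profiled loop is just a placeholder, not our import code):

  require 'ruby-prof'

  # Profile only the code inside the block
  result = RubyProf.profile do
    # Placeholder work standing in for the real import
    10_000.times { |i| "row,#{i},value".split(',') }
  end

  # Print a flat report (time spent per method) to the console
  printer = RubyProf::FlatPrinter.new(result)
  printer.print(STDOUT)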

Sensitive Data in MongoDB

At KoanHealth, we work with healthcare data, using Ruby with MongoDB as our primary database for patient data. This means it has to be secure at rest and over the network. We looked at a few gems for storing encrypted data, but they:

- Weren't specific to MongoDB, and felt clunky
- Made transitioning between raw and encrypted data difficult

Existing solutions

There are some solutions, but we really just needed a gem that:

- Encrypts data at a field level
- Makes it easy to access the raw and encrypted value
- Allows developers to use their existing encryption gem

Encrypted fields

So we've built mongoid-encrypted-fields.
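A rough sketch of how field-level encryption like this can be wired up (the cipher hookup, the Mongoid::EncryptedString field type, and the .encrypted accessor follow the gem's README-style usage, but treat the exact names here as assumptions; Gibberish stands in for whatever encryption gem you already use):

  require 'mongoid'
  require 'mongoid-encrypted-fields'
  require 'gibberish'

  # The gem delegates encryption to a cipher you provide,
  # so you keep using your existing encryption library.
  Mongoid::EncryptedFields.cipher = Gibberish::AES.new(ENV['ENCRYPTION_KEY'])

  class Patient
    include Mongoid::Document

    field :name, type: String                   # stored as plain text
    field :ssn,  type: Mongoid::EncryptedString # encrypted at rest
  end

  patient = Patient.new(ssn: '123-45-6789')
  patient.ssn           # raw value for application code
  patient.ssn.encrypted # ciphertext actually persisted to MongoDB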