The first week of Badger Academy

Last Wednesday marked the beginning of Red Badger's intern training programme - Badger Academy. As interns, Eric and I will be the 'prototypes', working through a dynamic syllabus covering the fundamentals.

Our guidance consists of one full day a week with a senior developer, and in our case we'll be re-engineering an internal app that was started two years ago but left unfinished.

Badger Time Reborn

Badger Time, wittily named by Cain, was to be a resource management platform. At a basic level, it would enable a business owner to plan potential projects and analyse ongoing projects, calculating financial figures based on how many hours people were assigned to projects and how many hours they'd fulfilled (using data from the FreeAgent platform).


We collectively decided that the best course of action was to rebuild it from scratch, using the old codebase as a reference point, as that would take less time and effort than fixing everything that was currently wrong with it.

As far as any intern is concerned, writing software from scratch is a daunting task. Still, we took the briefing with a positive attitude and revelled in the prospect of learning every aspect of building a working, maintainable piece of software.

Planning

The first stage of any structured task is planning - what do we want, how do we want it and what problems will we face? Thankfully, we were able to recycle the designs from the old project, which simplified a lot of the what and the how.

User Stories

The bulk of last Thursday was spent on these. Viktor, the senior assigned to us for the day, took us through building a backlog of user stories for each feature that would be in the minimum viable product. Building these user stories helped us understand each feature from a user's point of view, and simplified the process of spotting potential problems. We used Trello to organise them, as it allowed us to sort the backlog by priority and establish a pipeline of things to get done.

Building a data model

As we'd be handling large amounts of data coming from different sources, it was imperative that we have a well-built data model. There were two main factors to keep in mind (illustrated in the sketch after the list):

  • Avoid repeating the same data in various places
  • Avoid storing anything in the database that can be calculated on demand
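
As a quick sketch of what this means in practice (the Project and Assignment models and their fields here are hypothetical, not our actual schema), a project's forecast cost is something we'd calculate on demand from its assignments rather than store in a column of its own:

    # Hypothetical Rails model: hours and rates live only on the assignment,
    # so the forecast cost is always derived from a single source of truth
    # rather than duplicated as a column in the projects table.
    class Project < ActiveRecord::Base
      has_many :assignments

      def forecast_cost
        assignments.sum { |a| a.hours_assigned * a.hourly_rate }
      end
    end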

Technology

Docker - the foundation of our project

We made a few ambitious decisions regarding tech choices in Badger Time. We'd be using Docker to handle our development and production environments. Both the seniors and we were super interested in this new technology, as it would solve a lot of current problems. Right now, most projects are handled using Vagrant for a virtual machine and Ansible for provisioning. This incurs a performance hit, as everything is run in a virtual machine, and it can also take upwards of 30 minutes to get a project up and running on a new machine. Docker eliminates this by running everything in containers, which are like 'layers' on top of the host machine. A container can be built once (similar to provisioning), pushed to a remote registry, and then downloaded and run on any machine capable of running Docker.

Because Docker containers are purely layers on top of the existing system, they are much smaller and more portable than a full-blown virtual machine. It also means we eliminate any discrepancies between development and production, allowing for a much smoother deployment process.
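
To give a flavour of what that looks like, here's a minimal sketch of a Dockerfile for a Rails app like ours (the base image, versions and port are illustrative rather than our actual setup):

    # Illustrative Dockerfile for a Rails API (versions are examples)
    FROM ruby:2.1

    WORKDIR /app

    # Copy the Gemfile first so the expensive bundle install layer
    # is cached until the dependencies actually change
    COPY Gemfile Gemfile.lock ./
    RUN bundle install

    COPY . .

    EXPOSE 3000
    CMD ["bundle", "exec", "rails", "server", "-b", "0.0.0.0"]

Built once with docker build, the resulting image can be pushed to a registry and then pulled and run anywhere with docker run - no half-hour provisioning step.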

Rails - the trustworthy powerhouse

We'll be using Ruby on Rails to write a RESTful API which will handle requests and serve data from our database, as well as make frequent syncs of data from FreeAgent. Ruby on Rails is solid, easy to read and write, and provides a large repository of 'gems' which allow us to extend the functionality of our app easily. It was an easy, safe choice, backed up by the fact that the old Badger Time was written entirely in Rails, so we could recycle some of the code as most of the business logic was still up to date.
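
As a sketch of the shape this takes (a hypothetical read-only Project resource, not our actual endpoints), exposing JSON from Rails is only a few lines:

    # config/routes.rb
    resources :projects, only: [:index, :show]

    # app/controllers/projects_controller.rb
    class ProjectsController < ApplicationController
      # GET /projects - every project as JSON
      def index
        render json: Project.all
      end

      # GET /projects/:id - a single project as JSON
      def show
        render json: Project.find(params[:id])
      end
    end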

React.js and LiveScript - lightning fast rendering, with clean and structured code

Rather than making an isomorphic app, we took the same design principles as Haller and divided the app into a backend and a frontend. This enables our app to scale much more easily - we can serve the frontend as a static site from a host like Amazon S3 (fast!) and then scale the backend separately. Using React and LiveScript, we can build a purely functional frontend - ditching the traditional MVC application model in favour of breaking our UI up into simple components which contain their own logic (and are ridiculously fast because of how React works).

Compare the following (functionally) identical pieces of code:
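
As an illustrative sketch (the tiny 'Hello' component and its markup are just an example), first in plain JavaScript:

    var React = require('react');

    module.exports = React.createClass({
      render: function () {
        return React.DOM.div({className: 'hello'},
          React.DOM.h1(null, 'Hello, ' + this.props.name)
        );
      }
    });

And the functionally identical component in LiveScript:

    require! react

    {div, h1} = react.DOM

    module.exports = react.create-class do
      render: ->
        div {class-name: 'hello'},
          h1 null, "Hello, #{@props.name}"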

You don't need to understand what's happening to notice how much simpler and cleaner it looks in LiveScript! You can read Stuart's post on this very same stack for a deeper understanding of why it's awesome. We love it and we're sticking to it!

So as you can see, it's a pretty ambitious proposal full of new, exciting stuff, and this project is the perfect opportunity to test it all out. We're keen to get the show on the road and get into the meaty part of the development work, but we're also eager to build something slick that will be considered a solid, well-written codebase. I have high hopes that Docker will become a thing around here, catching on as the go-to tool for handling DevOps just like React and LiveScript have for frontend!
