Written by Pierre-Alban Dewitte; originally posted on his Tumblr.
My dear friend Bruno and I spent a few hours over the past weeks building an online multi-player quiz. The goal was to learn new tools and libraries, then share our work during the Summer of APIs hackathon.
Choosing our data-set
We have both lived in Nantes for a long time, yet we do not know the stories of its famous people. All of them have a park, a street or an avenue named after them; the names sound familiar, but we do not know their stories. We decided to mix the open data set containing all the streets of our city with the Wikipedia list of famous people related to Nantes to build a multi-player quiz. The principle is simple: you log in, you are shown a question and three propositions, and each player has to respond as fast as possible.
When you want to have fun and quickly share your idea, Node.js is the only solution, and the best cloud provider to host it is Clever Cloud. No discussion there.
We both have skills in Angular and did not see any reason to experiment with a new library. We just tried out getmdl.io in order to get a “Material” look. If you try a quiz you will quickly see that design is not one of my skills.
Coordinating clients and server
In a multi-player quiz, coordination between all players is key, and that requires a fast and reliable communication protocol. In a previous project we tried to build a video chat with WebRTC. Despite a lot of effort we never got it working as expected, but we saw great potential in WebSockets and really wanted to show their power in our new application. That is why we chose socket.io: a very elegant library that is simple to set up. The trickiest part was specifying all the client states we needed in order to build the server the right way. Thanks to socket.io, we could focus only on the quiz logic and the mapping of external APIs.
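The server-side coordination can be sketched roughly as follows. The event names ("join", "answer", "score") and the scoring rule are illustrative assumptions, not our app's actual protocol; the idea is simply that faster correct answers earn more points.

```javascript
// Assumed scoring rule: full points for an instant correct answer,
// decreasing linearly to zero at the time limit.
function scoreAnswer(isCorrect, elapsedMs, timeLimitMs = 10000) {
  if (!isCorrect) return 0;
  const remaining = Math.max(timeLimitMs - elapsedMs, 0);
  return Math.round(100 * (remaining / timeLimitMs));
}

// socket.io wiring (skipped gracefully when the library is not installed).
let Server = null;
try { ({ Server } = require('socket.io')); } catch (e) { /* optional dependency */ }

if (Server) {
  const io = new Server(); // in a real app, attach it: io.attach(httpServer)
  let questionStartedAt = Date.now();

  io.on('connection', (socket) => {
    socket.on('join', () => socket.join('quiz'));
    socket.on('answer', ({ correct }) => {
      // Score depends on how fast the player answered.
      const points = scoreAnswer(correct, Date.now() - questionStartedAt);
      socket.emit('score', points);
    });
  });
}
```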
Here is the first version of the flow. This picture helped us a lot while building our app.
After defining the flow, we built a state diagram and were ready to code.
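As an illustration, such a state diagram can be encoded as a transition table. The state names below are reconstructed from the flow description, not copied from our actual diagram.

```javascript
// Hypothetical client states and their allowed transitions; the real
// diagram may use different names and extra states.
const TRANSITIONS = {
  connected: ['waiting'],
  waiting: ['question'],
  question: ['answered', 'timeout'],
  answered: ['results'],
  timeout: ['results'],
  results: ['question', 'gameover'],
  gameover: [],
};

// Returns true when moving from `from` to `to` is allowed by the diagram.
function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}
```

Encoding the diagram as data makes it easy for the server to reject out-of-order events, for example an answer arriving before a question was sent.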
Serving the questions
Our first option was to use the Wikipedia API directly and build each question from the first sentence of the selected article. It quickly appeared that we would also need to link each question to the right answer, the propositions, and the words to hide in the Wikipedia extract. The rule for hiding is not always the same: for Lamoricière, we had to hide both “Lamoricière” and “Moricière”. Since we really wanted to move fast and did not intend to use every Wikipedia article, we decided to store our questions in a Google sheet and serve them to our application through APISpark.
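One row of the sheet might look like the object below. The column names and the sample text are hypothetical, but the `wordsToHide` list shows how the Lamoricière case can be handled.

```javascript
// Hypothetical shape of one question row; real column names may differ.
const question = {
  id: 1,
  prompt: 'A French general born near Nantes, he led troops in Algeria.',
  answer: 'Lamoricière',
  propositions: ['Lamoricière', 'Cambronne', 'Mellinet'],
  // Both spellings must be masked in the Wikipedia extract.
  wordsToHide: ['Lamoricière', 'Moricière'],
};

// Replace every listed word with an ellipsis in the question text.
function maskWords(text, words) {
  return words.reduce((acc, w) => acc.split(w).join('...'), text);
}
```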
Setting up an API with data stored in a Google sheet
With APISpark, serving a Google sheet through an API was very quick. First we had to create an entity store and link it to our Google account.
Once you have chosen the spreadsheet, each column name is used as a property of the entity by default, and the first column is used as the primary key.
To expose the data as an API, you then need to deploy and export it. The position of this option in the user interface is not obvious.
After that we just opened up the API to anybody by changing the settings of the generated web API.
You see, it was very simple to expose the data. You now understand why we did not focus on automatically grabbing data from Wikipedia.
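Once the API is open, reading a question boils down to a plain HTTP GET. The base URL and resource path below are placeholders, not the real APISpark endpoint.

```javascript
// Placeholder base URL: substitute the endpoint APISpark generated for you.
const BASE = 'https://example.apispark.net/v1';

// Build the URL for a single question resource.
function questionUrl(id) {
  return `${BASE}/questions/${encodeURIComponent(id)}`;
}

// In the browser, or in a recent Node.js with a global fetch:
// const question = await fetch(questionUrl(42)).then((r) => r.json());
```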
Adding more information with answers
Once we had the API and the flow on a nice technical stack, building the screens and the server was a matter of time. We also wanted to give more context with each answer, such as showing the user a view of the street named after the famous person. For that we needed to find the street named after the person, attach GPS coordinates to it, and use the Street View API.

A data set of street names exists for our city. We tried to load it into a Google spreadsheet and search it with APISpark, but currently you cannot search column values with a non-exact match. I am pretty sure this will come one day. To fulfill our need, we loaded the data into a MongoDB instance hosted on MongoLab (we could also have used the Clever Cloud service) and queried it to find the street names. We used MapQuest to get the GPS coordinates from a street name and then built the Street View URI.
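The non-exact lookup and the Street View URI can be sketched like this. The collection and field names ("streets", "name") are assumptions, and the URL follows the public Street View image API format; our actual code may differ.

```javascript
// MongoDB supports regex matches, which APISpark's column search did not.
// Used as: db.collection('streets').find(streetQuery('Moricière'))
function streetQuery(lastName) {
  // Escape regex metacharacters so the name is matched literally.
  const escaped = lastName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  return { name: new RegExp(escaped, 'i') }; // case-insensitive substring match
}

// Build a Street View image URL from the geocoded coordinates.
function streetViewUrl(lat, lon, apiKey) {
  return 'https://maps.googleapis.com/maps/api/streetview' +
    `?size=600x300&location=${lat},${lon}&key=${apiKey}`;
}
```

A substring regex is what makes the Lamoricière case work: searching for "Moricière" finds the street even though the sheet-based API could only match exact values.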