
- April 06, 2016

We want to make our data available to as many clients as possible, which means removing technical barriers. Each database server exposes its own protocol, so a dedicated driver is required on the client side to interact with it.

01-data-driven-web-apis

Before diving into how to configure and use this feature of APISpark, let’s describe the database we will use all throughout this article.

The database

For this article, we chose a MySQL database hosted on Amazon, which provides the security level we expect.

We created a simple schema with two tables, Customer and Order, that define a 1:n relationship between them: a customer can have a set of orders, and each order is linked to a specific customer.

02-database-schema

As a hint, this schema was created with the following DDL script:

CREATE TABLE T_CUSTOMER (
    id INT NOT NULL AUTO_INCREMENT,
    name VARCHAR(255) NOT NULL,
    PRIMARY KEY (id)
) ENGINE=INNODB;

CREATE TABLE T_ORDER (
    id INT NOT NULL AUTO_INCREMENT, 
    customer_id INT NOT NULL,
    amount DOUBLE NOT NULL,
    INDEX customer_index (customer_id),
    PRIMARY KEY (id),
    FOREIGN KEY (customer_id) 
        REFERENCES T_CUSTOMER(id)
        ON DELETE CASCADE
) ENGINE=INNODB;

Now let’s tackle the APISpark part.

Wrapping the database in APISpark

The first thing to do is to create an entity store in APISpark. In this case, it’s not the built-in datastore, but a store of type “SQL wrapper”.

When selecting this kind of store, the dialog lets you fill in all the details to access your database:

  • Database type and driver to use
  • Host name
  • Port number
  • Database name
  • User / password

The following figure describes this creation dialog:

03-sql-store-creation-new

We can test the connection at this stage to make sure we didn’t make any mistakes when filling in the form.
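As a side note, if you want to double-check these connection details outside of APISpark, a minimal sketch using the PyMySQL driver could look like the following; all values are placeholders to replace with your own settings.

import pymysql

# Placeholder connection details: replace with your own host, port,
# database name and credentials (the same values entered in the dialog).
connection = pymysql.connect(
    host="mydb.example.amazonaws.com",
    port=3306,
    user="api_user",
    password="secret",
    database="shop",
)
with connection.cursor() as cursor:
    # Simple query to confirm that the schema is reachable.
    cursor.execute("SELECT COUNT(*) FROM T_CUSTOMER")
    print(cursor.fetchone())
connection.close()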

When clicking on the “Add” button, the store is created and entities are added by introspecting the database. In our case, two entities are automatically added for the T_ORDER and T_CUSTOMER tables. Their structure mirrors that of the corresponding tables.

04-store-overview

We can leverage all the features of entity stores for the newly created store. After deploying the store, we can access the entity browser to manage data directly from the Web UI.

For this, simply click on an entity in the left menu and then on the “Browser” tab in the right panel. We can then click the “Add” button. A dialog opens where we can fill in the details of the entity to create.

05-entity-browser-add

Notice that only the name property is displayed in the form. This is because the id property was configured as auto-incremented when the table was created.

After clicking on the “Add” button in this dialog, the element is saved in the database and appears in APISpark.

06-entity-browser

The identifier of the element (its id property) was automatically assigned. APISpark lets the database handle this because of the AUTO_INCREMENT attribute specified when creating the T_CUSTOMER table.

If we have a look in the database, the corresponding row is there:

07-database-rows

Handling schema updates

The database schema can also evolve, for example by adding or removing columns. APISpark handles this through its “Synchronize” feature. This feature is implicitly used when creating the SQL entity store, but it can also be executed explicitly to re-synchronize the store.

To use this feature, simply go to the “Settings” tab and click on the “Synchronize” button.

08-store-configuration


Once the synchronization is done, a message is displayed and entities are updated within your entity store.

09-store-configuration-synchro

Imagine that we add a column to the T_CUSTOMER table with the following “ALTER TABLE” DDL statement:

ALTER TABLE T_CUSTOMER ADD address VARCHAR(255);

After synchronizing the store, we will see a corresponding new property called address in the T_CUSTOMER entity:

10-store-entity-updated

Exporting the store as a Web API

In APISpark, a store can be made available to the outside world by exporting it as a Web API. This operation adds a Web API in front of the store, so that external clients can interact with it using the HTTP protocol.

To achieve this, simply open the tools menu of the store and click on the “Export web API” item.

11-entity-store-export

This opens a form where you can fill in details such as the Web API name and its domain. You can also set parameters to customize permissions as well as collection and representation formatting.

12-entity-store-export-form

By clicking on the “Add” button, we actually add a Web API that is linked to the SQL entity store. This means that the Web API will rely on this store for its data processing.

13-sql-web-api

The Web API automatically maps the entities of the store by adding corresponding representations and resources to manage them. Because of the naming conventions used in the database, the resource paths are not very URL-friendly. Let’s see how to make things cleaner.

Updating entity names

At the moment, there is an exact match between entity names and the names of the corresponding tables. The main consequence is that when exporting the entity store as a Web API, the entity names are used to generate the corresponding resource paths. As seen in the previous section, the resulting paths weren’t very URL-friendly.

Since a SQL entity store is an abstraction in front of an SQL database, it provides an indirection level. This means that it maps entities to tables and it’s possible to have different names for entities and tables. The store will handle the mapping.

To update entity names, simply go to the entity details panel, change the name and save your changes.

14-entity-store-update-entity

Notice that the updates of the entity store were automatically reflected in the Web API. In our case, it simply followed an entity name update. This feature is particularly convenient to keep the Web API and the entity store consistent. With these two updates (one per table), our Web API now looks like this:

15-web-api-after-rename

This approach also applies when synchronizing the entity store against the target database. For example, if a table is added, both the corresponding resources and representations will also be added.

This feature is particularly useful since table names in SQL databases often use prefixes like “T_” or similar conventions.

The last thing we need to do is deploy the Web API; we will then be ready to manage our data using only the HTTP protocol.

Interacting with the service

At the HTTP level, we no longer use SQL directly: there is no need to worry about drivers or SQL syntax. Let’s now see how to interact with the data stored in the database.

To get data, simply use the GET method on the /customers/ or /orders/ endpoints. The returned content uses the JSON format by default. For our API testing purposes, we will use the DHC web API client, as shown in the screenshots below.

16-dhc-list-customers
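For readers who prefer a scripted client over DHC, here is a minimal sketch of the same listing request using Python and the requests library; the base URL and credentials are placeholders to adapt to your own deployment and permission settings.

import requests

# Placeholder values: use the domain chosen when exporting the Web API
# and, if permissions require it, your own HTTP Basic credentials.
BASE_URL = "https://example.apispark.net"
AUTH = ("api_login", "api_password")

# List all customers; JSON is the default representation format.
response = requests.get(f"{BASE_URL}/customers/", auth=AUTH)
response.raise_for_status()
print(response.json())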

In the case of orders, we can filter the list using the customer_id query parameter.

17-dhc-order
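The same filtered request could be scripted as follows, again with a placeholder URL and credentials.

import requests

BASE_URL = "https://example.apispark.net"   # placeholder domain
AUTH = ("api_login", "api_password")        # placeholder credentials

# Only the orders of customer 1, thanks to the customer_id query parameter.
response = requests.get(f"{BASE_URL}/orders/",
                        params={"customer_id": 1},
                        auth=AUTH)
print(response.json())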

To manage a particular element by its identifier (the id field in the database), we can append this identifier to the URL. For example, to update an element, we can use the PUT method on the URL /customers/1 with a JSON payload containing the new name of the customer.

18-dhc-customer-put
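The same update could be issued from a script as shown below; the new customer name is purely illustrative.

import requests

BASE_URL = "https://example.apispark.net"   # placeholder domain
AUTH = ("api_login", "api_password")        # placeholder credentials

# Update customer 1 with a JSON payload containing its new name.
response = requests.put(f"{BASE_URL}/customers/1",
                        json={"name": "Jane Doe"},
                        auth=AUTH)
print(response.status_code)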

Other methods are available for other operations: POST to add a new element and DELETE to remove one. For more details, you can refer to the overview page of the Web API, which is self-explanatory. If you want to know more about the REST principles behind Web APIs, you can have a look at the article “Designing Web APIs with Restlet Studio – Web API concepts”.
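For completeness, here is a sketch of the corresponding POST and DELETE requests; the payload and identifiers are again illustrative placeholders.

import requests

BASE_URL = "https://example.apispark.net"   # placeholder domain
AUTH = ("api_login", "api_password")        # placeholder credentials

# POST adds a new element; its id is generated by the database (AUTO_INCREMENT).
created = requests.post(f"{BASE_URL}/customers/",
                        json={"name": "New customer"},
                        auth=AUTH)
print(created.status_code)

# DELETE removes the element whose identifier is appended to the URL.
deleted = requests.delete(f"{BASE_URL}/customers/2", auth=AUTH)
print(deleted.status_code)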

Conclusion

In this article, you saw how to implement a data-driven Web API based on a SQL database. We walked through the steps needed to create and configure both the entity store and the Web API within the APISpark platform.

We emphasized how easy and quick it is to create such a Web API by letting the platform introspect the database and create the corresponding entities. APISpark also lets us customize the generated entities to tailor the Web API contract to our needs.

We also showed how APISpark conveniently and flexibly keeps all the parts consistent when the database schema is updated.
