Leveraging nearshore Colombian IT talent to implement complex solutions such as distributed systems has gained traction in recent years as part of the broader push to modernize systems and migrate them to the cloud.
For some time now, business systems built on traditional architectures have faced a common difficulty worldwide: maintaining and scaling infrastructure components at the rate that data consumption grows, while keeping costs low enough to compete in today's market.
Many American companies, along with many others around the world, have chosen to migrate and modernize countless traditional business systems that have served various purposes for decades: content management systems (CMS), enterprise resource planning systems (ERP), internal systems built to suit specific business needs, web applications, and mobile applications, among many others.
It is now common to find countless technical articles about implementations of modern architectures, with terms such as "monoliths migrated to microservices", "adoption of Domain-Driven Design (DDD) patterns", "containerization and orchestration of new microservices", and "fully managed cloud services", among many others. All serve the same purpose: encouraging companies to take the step of transforming their legacy systems with new developments. Most of these references present quite convincing scenarios in terms of stability, maintainability, vertical and horizontal scalability, fault tolerance, and many other benefits the transition would bring, among which the most attractive, low cost, is highlighted.
Everything said up to this point is debatable, and there would undoubtedly be so many viewpoints that reaching a consensus on the feasibility and cost-effectiveness of transforming and migrating existing systems to the cloud would be difficult. However, experience and skill in design, implementation, and deployment are the deciding factors between a successful migration and a failed one. The rest of this article details work done for a multinational health-sector company that operates globally.
With more than seven development teams already on board working on new features to enrich its users' experience, the American company added an extra team of nearshore Colombian developers with the experience and key technical skills to build a crucial component of its new service-oriented architecture: a message hub. In a short time, this team built from scratch a service written in C# on the latest version of .NET Core (3.1). Alongside the service, an administration interface was built with React, Redux, Redux-Saga, and Material UI, among other libraries, to provide users with a clean, high-performance, and easy-to-use experience.
The requirements for this central piece in the communication and orchestration of the multiple services that make up the system called for transparent interaction with different storage systems such as Amazon S3, DynamoDB, Couchbase, and Redis. Containers are orchestrated by Kubernetes, providing the flexibility to change the broker/storage combination through configuration established in Docker Compose and later translated into Helm charts. In just six two-week sprints, release 1.0 was delivered, fully meeting the objectives of the MVP, including the ability to create JSON schemas directly in the UI to define the shape of the messages sent through the broker. Security rules and routing per actor are established in C# from the claims obtained from the token.
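To illustrate how a broker/storage combination can be swapped purely through configuration, a minimal Docker Compose sketch might look like the following. This is an assumption for illustration only: the service name, image, and environment variable keys are hypothetical, as the article does not disclose the actual configuration schema.

```yaml
# Hypothetical sketch: choosing the broker/storage combination via
# environment variables. Real keys and image names are not public.
version: "3.8"
services:
  message-hub:
    image: registry.example.com/message-hub:1.0   # hypothetical image
    environment:
      - MESSAGEHUB__BROKER=SQS                    # e.g. SQS, RabbitMQ, ...
      - MESSAGEHUB__STORAGE=Couchbase             # e.g. Couchbase, DynamoDB, Redis, S3
      - MESSAGEHUB__STORAGE__CONNECTIONSTRING=couchbase://couchbase
    depends_on:
      - couchbase
  couchbase:
    image: couchbase:community-6.0.0
    ports:
      - "8091:8091"
```

The same shape, translated into Helm chart values, would let each Kubernetes deployment pick its own combination without code changes.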
To ensure the correct operation and stability of the service, built following a Clean Architecture, 78% unit test coverage was achieved using xUnit + Moq, along with around 80 functional test cases, also written with xUnit and executed automatically by Jenkins in the continuous integration and continuous delivery (CI/CD) cycle. This CI/CD cycle was built with Jenkins pipelines running in Docker containers, triggered by webhooks integrated into the Git (Bitbucket) repositories. In addition to compiling the code and executing the unit and functional tests, the pipeline is also in charge of building and tagging the Docker images based on the Helm chart definition for each tenant. These images are automatically published to the container registry specified by each tenant, including Docker Hub and Amazon ECR.
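The stages described above can be sketched as a declarative Jenkinsfile. This is a sketch under assumptions: the stage names, test project paths, and registry variable are illustrative, since the actual pipeline is not published.

```groovy
// Illustrative sketch of the CI/CD stages described in the article.
// Paths, image names, and the $REGISTRY variable are hypothetical.
pipeline {
    // Assumes a Jenkins agent with the .NET Core 3.1 SDK and Docker available.
    agent any
    stages {
        stage('Build')            { steps { sh 'dotnet build' } }
        stage('Unit tests')       { steps { sh 'dotnet test tests/Unit' } }
        stage('Functional tests') { steps { sh 'dotnet test tests/Functional' } }
        stage('Image per tenant') {
            steps {
                // Build, tag, and push one image per tenant, driven by
                // that tenant's Helm chart definition.
                sh 'docker build -t $REGISTRY/message-hub:$BUILD_NUMBER .'
                sh 'docker push $REGISTRY/message-hub:$BUILD_NUMBER'
            }
        }
    }
}
```

Triggering this pipeline from Bitbucket webhooks, as the article describes, only requires the repository to notify Jenkins on push; the stages themselves stay unchanged.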
AWS Tenant example: Keycloak — Couchbase — SQS
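A tenant like this one could, for instance, be described by a Helm values file along the following lines. Every key here is a hypothetical illustration of the per-tenant configuration idea; the real chart schema is not public.

```yaml
# Hypothetical Helm values sketch for the AWS tenant above:
# Keycloak for identity, Couchbase for storage, Amazon SQS as the broker.
tenant: aws-example
auth:
  provider: keycloak
  url: https://keycloak.example.com/auth   # hypothetical endpoint
storage:
  provider: couchbase
broker:
  provider: sqs
  region: us-east-1
# Images for this tenant are pushed to its own registry (here, Amazon ECR).
registry: <aws-account-id>.dkr.ecr.us-east-1.amazonaws.com
```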