At Paradigma Digital we are passionate about programming (it’s our bread and butter), and we love to find out about the latest advances in the most widely used programming languages so that we can offer our clients state-of-the-art solutions.
However, it’s not easy to quantify which languages are being used globally. There are multiple tools on the internet that use algorithms to measure, with reasonable accuracy, which languages are most commonly used and which have the most developers behind them. One of the best known is the TIOBE Index.
A PaaS (Platform as a Service) is a cloud application development platform. It automates life-cycle tasks such as configuration, deployment and software scaling, so that teams can focus exclusively on programming the things that really add value to the business.
The most well-known examples are OpenShift, Cloud Foundry, Google App Engine and Amazon Web Services; even Docker has recently brought out its own native solution.
By now, we have all heard a lot about digital transformation. At Paradigma we have several years of experience digitalising large companies in Spain, and recently we have witnessed a certain confusion about how to implement digital transformation. Where are we going? Where do we begin?
In this post, we intend to show what a digital transformation consists of, how to approach it and, above all, why it is a process that should be imperative for any organisation hoping to survive in the digital age.
More and more companies are opting for architectures based on microservices. Because these microservices are highly specialised, it is often necessary to orchestrate several of them to fulfil a single business function. In response to this need, Netflix has recently released Conductor, a new product within the Netflix OSS ecosystem. Conductor is a flow orchestrator that runs in cloud environments, implementing each task in a flow through a microservice. Today in the blog we will see how it works.
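The core idea is that a workflow is an ordered series of tasks, each backed by an independent microservice. As a rough illustration of that idea only (this is a toy sketch in plain Python, not Conductor’s actual API, and the task names are invented for the example), an orchestrator might look like this:

```python
# Toy flow orchestration sketch (NOT Netflix Conductor's real API):
# a "workflow" is an ordered list of tasks, and each task function
# stands in for a call to an independent microservice.

def check_inventory(order):
    # Pretend inventory microservice: verify the item is in stock.
    return {**order, "in_stock": True}

def charge_payment(order):
    # Pretend payment microservice: charge only if the item is in stock.
    return {**order, "charged": order["in_stock"]}

def ship_order(order):
    # Pretend shipping microservice: ship only if payment went through.
    return {**order, "shipped": order["charged"]}

def run_workflow(tasks, payload):
    """Run each task in order, feeding the output of one into the next."""
    for task in tasks:
        payload = task(payload)
    return payload

result = run_workflow(
    [check_inventory, charge_payment, ship_order],
    {"item": "book", "qty": 1},
)
```

The point of the pattern, as in Conductor, is that the orchestrator owns the sequencing while each step remains an independently deployable service.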
For a while, technology made the difference, and the product or service that offered the most functionality was the most likely to dominate a market. But this is no longer the case: we live in the “user era”, and we are no longer asked for a specific functionality, but rather for “the best experience”. We are asked (almost required) to build a product that, besides being useful, is also simple and pleasant to use…
This new approach is already internalised in startups and purely digital companies. Uber and Airbnb are good examples: the overall experience they give their users makes them feel like VIPs, and that, not mere “functionality”, is what sets them apart from the alternatives.
Most of the information produced today is generated continuously (sensors, transactions, interactions, user activity…), so responding quickly in Big Data processing is becoming increasingly important.
The most common way to analyse all this information is to keep it in stable storage (HDFS, a DBMS…) for later periodic analysis through batch processing.
The main characteristic of streaming processing engines is that they are able to analyse this information as it arrives. We consider real time to mean processing a data stream in the shortest time possible in order to analyse the incoming information.
Among these recently emerged tools, we can highlight Samza (developed by LinkedIn), Storm and Flink. This article will focus on explaining what is behind Flink, how and where it emerged, and how we can use it in projects that require the shortest response times.
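The contrast with batch processing can be made concrete with a toy sketch. The following is plain Python, not Flink, and the sliding-window average is just an invented example aggregate: instead of storing all the data and analysing it later, each event is processed the moment it arrives and an up-to-date result is emitted immediately.

```python
from collections import deque

# Toy event-at-a-time processing (plain Python, NOT Flink): keep a sliding
# window over the most recent events and emit an aggregate as each event
# arrives, rather than analysing the full dataset later in batch.

def streaming_average(events, window_size=3):
    window = deque(maxlen=window_size)  # only the newest events are retained
    results = []
    for value in events:                # each event is handled on arrival
        window.append(value)
        results.append(sum(window) / len(window))
    return results

# Simulated sensor readings arriving one by one.
readings = [10, 20, 30, 40]
averages = streaming_average(readings)
# successive windows: [10], [10, 20], [10, 20, 30], [20, 30, 40]
```

A real engine such as Flink adds what this sketch lacks: distributed execution, fault tolerance, and event-time windowing, but the per-event processing model is the same.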
Nowadays our clients are already digital clients, clients who search on Google and expect results in milliseconds, clients who shop at Amazon and expect their order to be delivered the following day, clients who do all of their banking online, clients who search for reviews before booking a hotel.
This digital customer, who knows the advantages of digitalisation and is used to them, cannot be fooled any more. You cannot offer him a search engine that works worse than Google’s, a shopping process slower than Amazon’s, a bank that forces him to go into the branch all the time or a hotel without a proper online service.
The problem is that, in this new digital environment, traditional companies feel completely lost, watching how their business, which had barely changed in the past few years, is threatened by recently created start-ups that are completely changing the rules of the game.
Well, in this context we can say that digital transformation means bridging the gap between traditional companies and new digital consumers: a process of radical renewal at all levels, to operate more efficiently, through better use of new technologies and with new business models that seize the opportunities created by digitalisation.
The term Internet of Things was first coined by the military in the 1980s. Yet there can be no doubt that it is in recent years that the use of the concept and its applications has broadened dramatically, thanks to huge technological advances.
The number of connected devices has been growing exponentially lately, and it is expected to exceed 50 billion by 2020. It is for this reason that many experts refer to the Internet of Things as “the next industrial revolution”: one that will change the way we communicate with one another, work, travel and have fun, and it will also change the way governments and businesses interact with the world.
Amazon Web Services, Google Cloud Platform and Azure have become the main providers of Cloud technology today. Among the many IaaS and PaaS solutions these providers offer, the components aimed specifically at the Big Data field stand out. In this post, we’ll analyse the Big Data-oriented tools offered by these three providers, and we’ll break down the different components, such as storage, processing and intelligence solutions.
After the summer, the start of a “new school year” at Paradigma has been filled with new and interesting projects. Foreseeing this workload, we carried out an intense recruitment process in July and August to expand Paradigma’s Front team and meet the new demands, and thanks to this we have recruited very talented people. However, during the search I couldn’t help but notice, in the most evident and heartbreaking way, a reality that I had already heard about but had not yet experienced first-hand: the front developer profile doesn’t exist anymore.