The best way we can illustrate the benefits of the collapsed-stack approach is by sharing some of the results we’ve achieved with our own in-memory database and application platform:
A rich web-based GUI running on multiple desktop and mobile platforms offers a faster, slicker, simpler, more stable, and more responsive user experience. As a result, new users adopt the software more quickly, and more willingly.
Good modularity, composability, and extensibility reduce the time from development to customer hands and enable an immediate response to emerging customer needs. With an in-memory computing platform, solution developers and consumers don't need to trade off performance, modularity, and reliability; they can build a solution that is strong on all three.
A data integration feature enables the development of independent modules and apps without app-to-app integration or a universal bus with master data management orchestration. Instead, data and functionality are integrated organically, with a minimal amount of code, at the level of the in-memory platform and user interface, allowing modules to collaborate efficiently.
Because the technology is available on multiple popular platforms (.NET, Node.js, and Java on Windows, Linux, and OS X), newcomer developers are ready to go in one or two weeks, able to create new functionality and modify existing modules. No integration code against existing modules is required, so module and application development becomes fully independent.
A declarative programming style and the elimination of unnecessary APIs produce fewer lines of code and fewer bugs. A short-term investment in learning Starcounter technology turns into the long-term benefit of a predictable development cycle.
Starcounter’s in-memory technology provides the strong ACID properties that business transactions strictly require, while a server-centric architecture with a thin web-based client provides strong security guarantees. The technology is designed to eliminate opportunities for unauthorized access, query injection, and other common vulnerabilities found in today’s ERP systems. The data binding model exposes only a mirrored (server-secured) UI representation, letting the server consistently verify all input. Such properties are especially important in security-sensitive domains such as trading, banking, and accounting.
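The server-verified binding described above can be sketched in a few lines. This is a hypothetical illustration, not Starcounter's actual API: a server-side view-model mirrors the UI state and accepts a client's proposed change only after validating both the field name and the value, so malicious or malformed input never reaches the authoritative state.

```python
# Hypothetical sketch of server-verified data binding, assuming a mirrored
# view-model: the server holds the authoritative state, and client changes
# are applied only after server-side checks. Names are illustrative only.

class ViewModel:
    """Server-side mirror of the UI state; the client never mutates it directly."""

    def __init__(self, fields, validators):
        self._fields = dict(fields)       # authoritative state
        self._validators = validators     # per-field validation rules

    def apply_patch(self, field, value):
        """Apply a client-proposed change only if the server approves it."""
        if field not in self._fields:
            raise KeyError(f"unknown field: {field}")        # no hidden surface
        if not self._validators[field](value):
            raise ValueError(f"rejected value for {field}")  # server-side check
        self._fields[field] = value

    def get(self, field):
        return self._fields[field]

# Usage: an 'amount' field that must be a non-negative number.
vm = ViewModel(
    fields={"amount": 0},
    validators={"amount": lambda v: isinstance(v, (int, float)) and v >= 0},
)
vm.apply_patch("amount", 250)         # accepted by the server
try:
    vm.apply_patch("amount", -5)      # rejected: fails server-side validation
except ValueError:
    pass
```

Because every change flows through the server-held mirror, the client can only propose edits to fields the server has chosen to expose, which is the property that closes off injection-style attacks.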
By leveraging in-memory computing technology, hardware and software costs can be cut by up to 100 times. Customers can run their solution on one server instead of one hundred, which reduces the cost of hardware purchasing and maintenance, cluster administration staff (system administrators, DBAs), and the software licenses needed to run a cluster, and lowers power consumption to more environmentally friendly levels. At the same time, back-end performance increases proportionally, supporting mixed OLTP/OLAP workloads and a responsive UX that are unavailable with traditional data processing and integration approaches (RDBMS, MDM workflows, independent APIs, and the like).
Starcounter’s customer-proven in-memory database, combined with the elimination of server fleets, substantially reduces the risk of operational disaster (such as data loss due to hardware malfunction or software bugs) and the painful recovery that follows. If a server must recover after a power loss, snapshot and log recovery takes seconds to minutes, compared with the hours required by conventional DBMS-centric solutions.
Sharing data allows it to be managed in a meaningful way, and enables strong normalization coupled with multi-faceted data representation, without redundancy or logic scattered across modules. This also eliminates the need for customized master data management solutions and services (such as establishing master data, data cleansing, and de-duplication).
Enterprise system users will benefit from running highly modular solutions built with a full set of integrated modules, as requested by the business. Wrapper apps functioning inside the platform will allow for the seamless integration of legacy systems into the new solution, providing enterprises with a smooth migration path, as well as an escape from vendor lock-in situations.
The approaching Internet of Things era makes the use of in-memory platforms and collapsed stacks unavoidable. To put things in perspective, moving from around 7 billion human users to 60 billion users and devices means we can expect transaction loads to grow at least sevenfold (60/7 ≈ 8.5). As soon as 2020, 26 to 50 billion devices are expected to be connected to the network.
While device interaction will be seamless for users, it will, in fact, be an orchestration of device-driven transactions conducted via net-based apps operating on data. Device chains, federations, multi-party communications, and the like are IoT phenomena that add complexity to be considered. And the whole turns out to be greater than the sum of its parts.
All of this leads to an astonishing increase in the transactional workload that API-based web services must handle. If thousands of user connections per second can already be a problem for a REST service today, consider the strain of responding to millions of devices per second.
Now is the time to collapse the stack and get up to speed with the future. Luckily, in-memory technologies have arrived just in time.