The Future of Engineering with SysGit: Embracing Post-Cloud Architecture
At SysGit, we believe that hardware engineering teams can leverage software development methodologies in every facet of development, including computing infrastructure, to field better capabilities, faster. As the makers of the world's first post-cloud software package, we're here to delve deeper into what post-cloud means and the future we envision for engineers tackling hard problems.
What Post-Cloud Isn't
To clear up any misconceptions: post-cloud doesn't mean a future without cloud computing. On the contrary, we see distributed computing resources as foundational, and cloud computing remains essential, particularly for security. Post-cloud signifies a future where cloud computing is a fundamental, ubiquitous component, and the focus shifts towards optimizing, extending, and innovating beyond traditional cloud paradigms.
So What Is Post-Cloud?
Post-cloud architecture enables engineering organizations to maximize their existing infrastructure investments: the money, time, and labor spent selecting and building applications and microservices; the security measures put in place to protect data and achieve compliance; and the data investments in ontologies and APIs.
The applications that sit on top of infrastructure shouldn't require organizations to rethink these investments. Instead, they should leverage them, applying the collaborative and scalable benefits of software to bigger challenges and harder problems.
Evolution of Infrastructure
Pre-Cloud Era
In the pre-cloud era, on-premises was the norm. Organizations ran software on their own servers, giving them full control over their IT operations and data and allowing them to act directly to secure data and ensure compliance with regulations. However, physical data centers had storage capacity limits, and costs and complexity rose as organizations scaled.
Cloud Era
The cloud era began when providers like AWS, Azure, and Google Cloud introduced distributed, on-demand storage. Cloud computing offered shared resources, making capacity management the cloud provider's responsibility. This model fueled tech startups by providing easy access to the infrastructure needed to build and scale quickly. SaaS replaced traditional software licenses, and governments and large enterprises adopted cloud computing, building secure architectures of their own.
However, the complexity increased as organizations had to maintain security and compliance across various environments. Properly architected solutions in AWS, for instance, require numerous services, from authentication and data storage to API gateways and DNS lookup. Multi-cloud and hybrid cloud offerings emerged to address these challenges, yet they often added to the complexity.
The Challenge Today
Cloud computing hasn't reduced costs for many organizations. The proliferation of new services created a need for more in-house specialists, and subscription pricing often means paying for unused computing capacity if usage isn't monitored carefully.
As Ruby on Rails creator and entrepreneur David Heinemeier Hansson put it, “Some things are simpler, others more complex, but on the whole, I've yet to hear of organizations at our scale being able to materially shrink their operations team, just because they moved to the cloud.”
Meanwhile, cheaper storage and greater machine capacity have made on-premises solutions attractive again, and keeping data in-house reduces the complexity of security and compliance.
Organizations now seek platforms that do more than just store data. DevOps platforms like GitHub and GitLab offer centralized places to store, manage, build, and secure software. These platforms build on infrastructure organizations have already invested in, allowing them to leverage their existing cloud architecture without introducing new services or inventing a new ontology.
Lightweight Applications and Hardware Engineering
There's a need for lightweight applications that can run anywhere, utilizing existing security and data structures and interfacing with established APIs. This approach allows hardware engineers to operate with the same speed, collaboration, and scalability as software developers, applying version control principles to systems engineering.
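To make that concrete, here is a minimal sketch of what applying version control principles to systems engineering can look like in practice: requirements captured as plain YAML files in a Git repository, checked by a small script that a CI job could run on every merge request. The directory layout, field names (id, title, text, derived_from), and the use of PyYAML are illustrative assumptions, not a SysGit schema.

```python
# A minimal sketch, assuming requirements live as requirements/*.yaml files in
# the repository. A CI job can run this script on every merge request to catch
# missing fields and dangling trace links before a change is merged.
import sys
from pathlib import Path

import yaml  # PyYAML


def load_requirements(repo_root: Path) -> dict[str, dict]:
    """Read every requirements/*.yaml file into a dict keyed by requirement ID."""
    requirements = {}
    for path in sorted((repo_root / "requirements").glob("*.yaml")):
        data = yaml.safe_load(path.read_text()) or {}
        requirements[data.get("id", path.stem)] = data
    return requirements


def validate(requirements: dict[str, dict]) -> list[str]:
    """Return a list of problems: missing fields or dangling trace links."""
    errors = []
    for req_id, req in requirements.items():
        for field in ("id", "title", "text"):
            if field not in req:
                errors.append(f"{req_id}: missing required field '{field}'")
        parent = req.get("derived_from")
        if parent and parent not in requirements:
            errors.append(f"{req_id}: derived_from references unknown requirement '{parent}'")
    return errors


if __name__ == "__main__":
    problems = validate(load_requirements(Path(".")))
    for problem in problems:
        print(problem)
    sys.exit(1 if problems else 0)
```

Because requirements live as ordinary files, every change shows up as a diff, gets reviewed like code, and inherits whatever branch protections and access controls the underlying Git provider already enforces.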
However, deploying such solutions is complex. Organizations use a variety of tools for requirements management, system capture, and data storage, and they often need third-party tools to connect them and propagate changes. The rise of SaaS has produced proprietary relational databases and REST APIs, each inventing its own ontology and requiring extensive IT approvals before deployment.
Revised Shared Responsibility Model in a Post-Cloud Architecture
The Shared Responsibility Model emerged as the dominant paradigm for security. Pre-cloud, organizations owned all responsibility for security. With the advent of cloud computing, this shifted: cloud providers secured the software and environment, while customers retained control over their data and access. However, this division of responsibilities wasn’t as simple as it seemed. In practice, the Shared Responsibility Model placed more demands on customers, necessitating a complex middle layer to manage these shared responsibilities. This increased complexity underscores the need for a new model that inherently assumes these responsibilities and the associated architectural decisions.
Moreover, the cloud model carries inherent risks. Entrusting data storage to a third party introduces both security and business risks. Organizations that have invested in constructing their architecture for security and compliance should not easily cede control to a third-party provider. Doing so can introduce vulnerabilities and complicate compliance efforts. A revised shared responsibility model, based on post-cloud or hybrid cloud architecture, seeks to mitigate these risks by retaining greater control over security and compliance while still leveraging the flexibility and scalability of cloud services. This approach ensures that organizations can protect their investments and maintain the integrity of their security frameworks.
The SysGit Approach
At SysGit, our journey led us to this post-cloud moment. Initially, we followed the traditional path: inventing proprietary engineering ontologies and data models, integrating with existing tools, and building robust security features. As we worked more closely with partners in the Defense Industrial Base, we found considerable risk in introducing this model. Organizations want to work in an architecture already designed for the high security and compliance standards necessary to serve our nation's defense. They want to move from vendor lock-in to open standards, leveraging existing, industry-validated APIs and applying the security features of their underlying Git provider.
Now we know they want Git functions and workflows tailored for hardware engineering, ones that maximize investments in underlying Git infrastructure and reduce cost and complexity at the infrastructure layer. The future is post-cloud: building on the lessons of cloud computing and delivering lightweight, scalable solutions that leverage existing investments.
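As a rough illustration of leveraging existing, industry-validated APIs, the sketch below reads an engineering artifact through the Git provider's own repository API rather than a new proprietary service. GitHub's repository contents endpoint is used as an assumed example provider; the organization, repository, file path, and token variable are hypothetical.

```python
# A minimal sketch, assuming GitHub as the underlying Git provider: fetch an
# engineering artifact through the provider's existing repository contents API,
# so authentication and permissions stay with infrastructure the organization
# has already accredited. The repo, path, and token variable are hypothetical.
import os

import requests


def fetch_artifact(owner: str, repo: str, path: str, ref: str = "main") -> str:
    """Fetch a file's raw contents using the Git provider's own API and auth."""
    response = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/contents/{path}",
        params={"ref": ref},
        headers={
            # The provider's existing token and permission model handles access control.
            "Authorization": f"Bearer {os.environ['GIT_PROVIDER_TOKEN']}",
            # Ask for the raw file body rather than the JSON metadata wrapper.
            "Accept": "application/vnd.github.v3.raw",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.text


if __name__ == "__main__":
    print(fetch_artifact("example-org", "avionics-system", "requirements/SYS-001.yaml"))
```

The design point is that the application adds workflow on top of the repository, while identity, permissions, and audit trails remain with the Git provider rather than with a parallel data store that IT and security teams would have to approve separately.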
Join Us in the Post-Cloud Future
Hard problems won’t be solved by SaaS alone. At SysGit, we’re committed to reducing complexity and costs, enabling engineering organizations to tackle bigger challenges and solve harder problems. The future is post-cloud. Join us.