Building on the Mass Open Cloud (MOC), a real-world platform delivering open source projects as services, OI Labs is an incubator for new approaches to the cloud, focused on bridging the gap between operators and developers and on delivering open source tools to build and run cloud, container, AI, big data, and edge workloads efficiently, repeatedly, and predictably.
The efforts associated with OI Labs frequently overlap multiple open source projects – for example:
- Cloud in a Box: Prescriptive cloud installations that encompass monitoring, onboarding, offboarding, and billing/reporting. Initial implementations include:
- The New England Research Cloud (NERC) project by Boston University and Harvard University, integrating OpenStack and OpenShift with standard research computing production systems, such as the ColdFront resource allocation management system, as the basis for a standardized research computing software stack.
- The Community Cloud, based on the Operate First project incubated at OI Labs, which is focused on incorporating operational experience into the development of software projects rather than treating it as an afterthought.
- Project Caerus: An initiative focused on bridging the gap between the distributed compute and distributed storage platforms commonly used for big data and AI applications. Caerus aims to create a new open ecosystem that allows compute and storage platforms from different sources to operate in concert, substantially improving application performance, resource utilization, and application developer productivity.
- Elastic Secure Infrastructure (ESI): A set of services that allows multiple tenants to flexibly allocate bare metal machines from a pool of available hardware, create networks, attach bare metal nodes to those networks, and optionally provision an operating system on those machines through an associated provisioning service (a sketch of this workflow follows the list below). The code development features a mix of upstream OpenStack work (ironic and networking-ansible) and custom ESI code.
- Project Wenju: Addressing the “last mile” challenges of moving AI projects into production.
- Project Taibai: Taibai takes advantage of a microservices framework to deliver a scalable and customizable multi-cloud platform that enterprises can use to simplify multi-cloud management and improve the efficiency of cloud resource utilization. The Taibai unified service portal allows users, administrators, and operators to manage diverse cloud architectures and third-party integrations across multiple data centers and public clouds. You may view an introduction to the project at this link: Project Taibai
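To give a sense of the ESI workflow described above, here is a minimal, hypothetical sketch using the OpenStack SDK: it picks an available bare metal node from the pool, creates a tenant network, attaches the network to the node via a port, and triggers provisioning through ironic. The cloud name and the simplified single-tenant flow are assumptions; ESI's own leasing layer, which mediates which tenant may claim which node, is not shown here.

```python
# A minimal, hypothetical sketch of an ESI-style bare metal workflow using
# the OpenStack SDK. The cloud name "esi-cloud" is an assumption, and ESI's
# multi-tenant leasing checks are omitted for brevity.
import openstack

conn = openstack.connect(cloud="esi-cloud")  # assumed clouds.yaml entry

# 1. Pick an available bare metal node from the pool (ironic).
node = next(n for n in conn.baremetal.nodes(details=True)
            if n.provision_state == "available")

# 2. Create a tenant network and subnet (Neutron).
net = conn.network.create_network(name="tenant-net")
conn.network.create_subnet(network_id=net.id, ip_version=4,
                           cidr="192.168.10.0/24")

# 3. Attach the network to the node via a port (VIF).
port = conn.network.create_port(network_id=net.id)
conn.baremetal.attach_vif_to_node(node, port.id)

# 4. Provision an operating system on the node through ironic.
#    (In practice the node's instance_info, e.g. a deploy image, must be
#    set before requesting the "active" state.)
conn.baremetal.set_node_provision_state(node, "active", wait=True)
```

In an actual ESI deployment, lease and ownership checks happen before any of these calls, and networking-ansible handles the physical switch configuration behind the network attachment step.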
Read the full 2021 OpenInfra Annual Report here!