“Open source” is a mainstay in any conversation about computing, from the hobbyist individual to the enterprise organization. But as open source has grown, so have the ideas and ways of working that are considered “open.” The technology sector and our global population have benefited from truly open software, hardware, and standards. For a better, stronger technology sector, we have to remember the open part of open source.
This IDC Presentation will cover the Open Source Ecosystem Market and Trends and provide data and perspective related to the growing use of open source software in the industry. The industry has demonstrated remarkable resiliency and surprising growth during the COVID-19 pandemic, with open source software use continuing to accelerate through the period. The concerns related to open source software are increasingly centered on ensuring a secure supply chain and that reliable, trustworthy components are used in constructing solutions.
Companies are under tremendous pressure to transform digitally in order to remain valuable in an ecosystem of accelerating worldwide innovation. Organizations need software solutions that solve their problems with a rapid time-to-market, but at the same time they need to keep the flexibility to adapt their plans to changing conditions.
Open source has already proven its capacity to provide reliable solutions in different software areas, such as operating systems and deployment pipelines. An acceleration of the open-source movement, driven in part by the CNCF foundation, has brought a series of new solutions. The combination of such components along the software lifecycle is shaping Quality Engineering for continuous value delivery, as presented in this talk.
For today’s business challenges, you need the best of both data lakes and data warehouses, so you can run more precise analyses on all data, structured and unstructured, in the same place. By leveraging Apache Iceberg, companies are building open data lakehouse architectures that deliver reliability and analytics performance for any data at scale. Run business intelligence, AI, and machine learning use cases on the same data with your choice of engine.
Our favorite platforms for writing and sharing code, such as GitHub, GitLab, or Bitbucket, empower maintainers and contributors to collaborate efficiently on open source projects. However, these platforms don’t always feel adequate when dealing with security bugs. Correcting security flaws is a sensitive process: creating a public issue or pull request about a vulnerability could expose users to attacks. Furthermore, funding, missing knowledge, and misaligned incentives are common challenges that hinder collaboration between open source maintainers and security researchers. This session will discuss the best tools and practices that can help the two communities communicate and collaborate better.
The European Union has set itself the goal of adopting digital policies that are aligned with people’s rights, and Artificial Intelligence (AI) is no exception. Free Software plays an important role in the development and further success of such technologies. Hence, Alexander wants to point out specific demands that must be met to align with Europe’s ambition of building and deploying technologies that empower people while strengthening Europe’s economy.
Well thought-out development and use of AI, focused on the benefits of Free Software, will foster innovation, boost the economy, enhance control, strengthen trust, and make Europe fit for the digital age.
In this talk, Alexander Sander will shed light on the latest developments in AI in the EU and the role Free Software plays in them.
New research unpacks the evolution of the OSPO based on previous OSPO survey insights and lessons from some of the most noted open source leaders in the community. This research provides a set of patterns and directions to help implement an OSPO (Open Source Program Office) or an open source initiative within corporate environments. This includes an OSPO maturity model, practical implementation guidance from noted OSPO programs across regions and sectors, and a set of OSPO Personas, which drive differentiation in OSPO behavior.
The formation of OSPOs (Open Source Program Offices) is analogous to when organizations first started to establish CISOs in reaction to security incidents. The organizations that established these centers of security competency protected and armed themselves for a better future.
To help better explain the evolution of OSPOs, members of the TODO Group run an annual OSPO survey and have helped put together a maturity model that people can use in their organizations.
During this presentation, Ana will walk through each stage of the model. The audience will learn the different actions an OSPO should accomplish to advance in its journey based on the proposed model, and how to identify their organization’s OSPO persona.
The Open Food Network is a worldwide distributed network of organisations offering a SaaS platform based on collaborative Free/Libre Open Source Software. For 10 years we’ve been building a networked ecommerce platform with both community and self-hosted ‘instances’ that operate in regions around the world. We’ve learned that if food supply chains are to nourish communities and regenerate the Earth, we need to enable systems and processes that can nurture diversity - we need DIVERSITY AT SCALE. In software, diversity means a wildly complex feature set, which is hard to maintain. So we’re taking what we’ve learned and rewriting our data models to handle the kind of networked diversity that our planet and communities need. And we’re implementing data standards that can help coordinate the wider ecosystem of food supply chain platforms. Come and learn how we’re pioneering new approaches to building tech that supports communities and the natural world.
Data mesh architecture is not simply a type of data architecture but a new approach to designing modern data architectures, one that embraces organizational constructs as well as data-centric ones such as data management and governance. To understand how it works, we need to understand the four principles of data mesh: domain-driven design (DDD), data as a product, data access, and federated data governance.
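Two of the principles above, data as a product and federated data governance, can be made concrete with a small sketch. This is a toy illustration in Python, not any specific data mesh platform; the `DataProduct` fields and policy rules are invented for the example:

```python
# Sketch: each domain team publishes its data as a product with an owner,
# a declared schema, and metadata that a federated governance layer can check.
from dataclasses import dataclass, field


@dataclass
class DataProduct:
    name: str
    domain: str                 # owning domain team (domain-driven design)
    owner: str                  # accountable product owner
    schema: dict                # column name -> type: the product's contract
    pii_columns: set = field(default_factory=set)


def federated_policy_check(product: DataProduct) -> list:
    """Federated governance: global rules applied to every domain's product."""
    violations = []
    if not product.owner:
        violations.append("every data product must have an owner")
    for col in product.pii_columns:
        if col not in product.schema:
            violations.append(f"PII column '{col}' is not declared in the schema")
    return violations


# The sales domain publishes an 'orders' product that other domains can consume.
orders = DataProduct(
    name="orders",
    domain="sales",
    owner="sales-data-team",
    schema={"order_id": "int", "customer_email": "str", "total": "float"},
    pii_columns={"customer_email"},
)

violations = federated_policy_check(orders)
```

The point of the sketch is the division of responsibility: the schema and ownership live with the domain, while the policy check is shared across all domains.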
This lecture will present Cabo Verde’s e-government development and digital transformation journey, and its decision to release the source code of its primary high-performance, low-code development platform (IGRP) for more sustainable development and as a key action to involve the private sector.
Despite our best efforts, it’s difficult to future-proof our digital assets and open source projects against evolving accessibility standards. Your code and design may follow current standards, but will your careful compliance become obsolete when new guidelines are released? What happens when other contributors add features and documentation? Which browsers or assistive technologies do your consumers use?
We will review a holistic approach for the complete lifecycle of an open source project, including how to iterate on existing content without undoing the hard work you put in during the build phase. Much like SEO, your work is never done: accessibility should be considered in the initial build and then during maintenance, complete with sprint regression testing.
Attendees will come away from this session with:
- How to get started improving accessibility on an existing project
- Tips on useful combinations of user agents, browsers, and devices to test
- A list of free tools that a site owner or content author can use to verify that their website is compliant before and after a build
- What WCAG and other guidelines are on the horizon
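The kind of automated check such free tools perform can be sketched in a few lines. This toy example scans HTML for `<img>` elements missing alt text (WCAG Success Criterion 1.1.1); real audits use dedicated tools, and the sample markup here is invented:

```python
# Minimal alt-text checker built on the standard library's HTML parser.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # alt="" is valid for decorative images; a *missing* attribute is not.
            if "alt" not in attrs:
                self.violations.append(attrs.get("src", "<no src>"))


def check_alt_text(html: str) -> list:
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.violations


sample = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
missing = check_alt_text(sample)  # only the second image is flagged
```

A check like this is cheap enough to run as a regression test in every sprint, which is exactly where it belongs in the lifecycle described above.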
Developing a Java application does not have to be boring. Have you found yourself wasting hours on simple tasks? Quarkus brings back the developer’s joy of writing code. See your changes in the blink of an eye? Done! Start a database automagically? Done! Run only the tests affected by your code? Done! Deploy seamlessly to Kubernetes? You got it! Create a native executable? Go get a coffee! Quarkus is all about developer experience. If you want to learn more, join this session and rediscover the pleasure of developing Java applications.
In the long run, repetitive manual test execution tends toward inefficiency and failure.
We live in times when the SDLC moves much faster and waits for no one: multiple daily releases, teams developing in parallel and delivering everything into the same bucket, semi-controlled chaos. Ensuring software quality nowadays is a race of mini sprints within the sprint itself (a methodological framework that many teams have adopted).
Automating small test cases in large numbers is the only way to keep up the pace; otherwise, if we delegate testing solely to human execution, quality assurance becomes an act of faith.
Open-source test automation tools have unlocked huge potential for development teams. They are easy to install, accessible to non-technical testers thanks to a keyword-driven structure that makes use of natural language, and can be introduced as a checkpoint in every CI/CD pipeline, helping developers instantly assess the state of their code on each deployment.
Only someone who would refuse a bulletproof vest in a modern battle would forgo the benefits of regression testing.
Robot Framework is one of those bulletproof vests, saving teams from being hit by bug after bug. In this talk, we will perform a demo showing how test automation can be both accessible to all and fast.
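The keyword-driven idea behind tools like Robot Framework can be sketched in a few lines. This is a toy interpreter in Python, not Robot Framework itself; the keywords and shopping-cart steps are invented for the example:

```python
# Toy keyword-driven runner: test steps are natural-language keywords plus
# arguments, and the runner maps each keyword to a Python function.
class KeywordRunner:
    def __init__(self):
        self.keywords = {}
        self.context = {}

    def keyword(self, name):
        """Register a function under a human-readable keyword name."""
        def wrap(fn):
            self.keywords[name] = fn
            return fn
        return wrap

    def run(self, steps):
        """Execute a list of (keyword, *args) steps, failing on any assertion."""
        for name, *args in steps:
            self.keywords[name](self.context, *args)


runner = KeywordRunner()


@runner.keyword("Open Cart")
def open_cart(ctx):
    ctx["cart"] = []


@runner.keyword("Add Item")
def add_item(ctx, item):
    ctx["cart"].append(item)


@runner.keyword("Cart Should Contain")
def cart_should_contain(ctx, expected_count):
    assert len(ctx["cart"]) == expected_count, ctx["cart"]


# A test case reads almost like natural language, which is what makes
# keyword-driven tools approachable for non-technical testers:
runner.run([
    ("Open Cart",),
    ("Add Item", "apples"),
    ("Add Item", "bread"),
    ("Cart Should Contain", 2),
])
```

Because the test cases are plain data, they can be written by people who never touch the keyword implementations, which is the division of labor these tools rely on.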
There has been a lot of discussion regarding the impact of open source on security. Some consider it safer; others consider it less so. The openness of an open-source product is a major point of contention for both the general community and security professionals. It is reasonable to question the impact of source code availability on potential attackers as well as on white hat hackers and security experts.
During this talk, Zabbix will present its transformational journey in secure code development: the work done up until now and the work that is constantly ongoing. We will address questions such as: what benefits do our users and internal developers get from the openness of the code? Is it possible to reduce the development costs related to security tooling while not losing users’ trust?
Today, one of the main challenges for organizations is how to scale software development. They face problems with the increased complexity of software, complex organizations, distributed teams, coordination and inter-dependencies, managing and aligning releases, silos, and more. With the recent trend toward remote work driven by the COVID crisis, these problems have become more critical and harder to solve, particularly figuring out how to "return to normal" or establish a new normal.
In this talk, we will explore the development of the Linux kernel and how it scaled its software development in a constantly changing and evolving context. By examining the development model, the audience will gain insights into how to develop, deliver, and maintain a large-scale software product with a global, diverse community. Furthermore, we will understand how the community solved the related challenges by looking at which solutions are in place.
The Linux kernel is one of the world’s longest-running and biggest open-source software projects. Currently 28 years old, the project comprises an estimated community of 5000 to 6000 developers and more than 26 million lines of code. It releases a stable version every 8 to 12 weeks and maintains seven versions simultaneously (two stable and five long-term maintenance releases). The latest numbers say that over 85% of the world’s smartphones and over 90% of the world’s top internet web servers use Linux.