Take advantage of the opportunities of AI – with the right qualifications from Spirit in Projects

The age of AI: the future's in your hands!

The future is right now! Dive into the world of artificial intelligence (AI) and find out how it's revolutionizing the IT landscape. In this article, we explore where AI is already relevant and why expert knowledge in a variety of specializations is the key to success.

The meteoric growth of artificial intelligence (AI) has turned our world around and permanently changed the IT industry. In an era in which data has become our most valuable currency, AI is the very heart of our digital transformation. It’s pushing forward innovations at an unprecedented pace and making it possible for companies to increase efficiency, understand their customers better and develop groundbreaking solutions.

Development of AI

In recent decades, artificial intelligence (AI) has experienced significant advances which have fundamentally changed the way we see technology and business processes. AI's early years (from the 1950s to the 1970s) focused on symbolic AI, where rules and symbols were used to simulate human intelligence. This led to the development of expert systems, which had the ability to apply specialized knowledge specified in the form of rules.

The next phase, known as the "AI winter" (from the 1980s to the early 2000s), was marked by disappointments and financial setbacks, as the high expectations placed on AI were not met and many projects were suspended.

The revival of AI began in the late 2000s, as advances in machine learning and the availability of huge volumes of data opened up new possibilities for the technology. Machine learning made it possible for algorithms to recognize patterns in data and build models automatically. In particular, deep learning with neural networks led to groundbreaking advances in fields such as image recognition, natural language processing and autonomous driving.

Recent decades have seen the integration of AI technologies in a wide range of applications, from personalized recommendation systems in social media all the way up to medical diagnosis and autonomous driving. In addition, ethics and governance in relation to AI have become increasingly important, since the technology raises issues related to data protection, bias and responsibility. Finally, the development of autonomous systems such as self-driving cars and drones has fundamentally changed industries such as transportation and logistics.

All in all, AI has grown from a theoretical concept into a practical reality which has profoundly influenced our lifestyles as well as the way we do business. Focus has shifted from symbolic AI to data-driven approaches like machine learning, which in turn has led to significant advances in the performance capabilities and applicability of AI technologies.

The AI revolution: Where are we today?

Today, even mainstream users have access to a wide range of technologies such as large language models (LLMs) and image generators, and AI is a hot topic in the media. AI's areas of application have never been more diverse. For example, it's used in healthcare to help physicians diagnose complex diseases. And in the automobile industry, it's working to bring self-driving cars to the road. The ethical and societal repercussions of AI are currently the subject of intense debate, and the way in which we address these issues will have a significant impact on our future.

AI is more than just a technological innovation – it’s changing the way we live and work, and presenting us with challenges that we must overcome together.

Karl Schott, Founder & CEO

Importance of specialist knowledge

Multi-disciplinary teams that bring together a wide variety of qualifications play a crucial role in determining whether or not a company uses AI successfully. Such diversity makes it possible to consider AI applications from different perspectives and to develop interdisciplinary approaches to finding solutions. AI projects require expertise in a wide range of fields, including data analysis, machine learning, software development, ethics, design, project management and domain expertise. A multi-disciplinary team is better able to manage this diversity and to tackle a variety of challenges, from data collection and processing through model development and ethical assessment all the way up to implementation.

And since customer orientation is also of great importance, a diversified team is better able to empathize with the needs and expectations of the company's customers and can develop AI applications which are truly of value. In addition, ethical and legal experts can see to it that the company's AI applications meet ethical standards and legal requirements.

A team of this nature should be built on solid technical qualifications such as data science and machine learning, software development, expert domain knowledge, ethics and governance, UX/UI design, project management, communication skills and, last but not least, a touch of creativity.

For management, it's essential to be aware not just of the opportunities afforded by AI but also of how it works, so that the company chooses the tools which are right for its success and trains its employees accordingly. In addition, when making strategic, data-based decisions, it's increasingly important to be able to correctly interpret the underlying data and decision processes.

The key: Up-to-date qualifications

The key to taking full advantage of this historic technological change is ongoing training which ensures that not only your company's employees but also you yourself always have the right qualifications. That's exactly where our current training program comes in.

We provide expert, proven, state-of-the-art knowledge in specialty areas of technical development which will reinforce your technical skill set. This is essential, since a solid technical foundation is a crucial starting point for managing innovation. At the same time, our newly designed innovation courses which focus on AI will ensure you apply your know-how in line with what’s happening right now.

Put together your own personal program of training to make optimal use of the possibilities provided by modern technologies. Now’s the perfect time to take charge – proactively shape your professional future by taking advantage of the opportunities afforded by AI!


Our trainings on artificial intelligence:

Data Journeys – effective analyses with Spirit in Projects

Data Journeys: A Path to Better Digitalization for Companies

In today's digital age, companies are under increasing pressure to digitalize their processes and data in order to stay competitive and meet customer demands. However, the digitalization process can be complex and full of challenges, and companies often find it difficult to embark on that journey effectively. Detailed analyses of business processes require a great deal of time as well as experience in order to achieve results efficiently.

Spirit in Projects has developed a method in which we approach the analysis from the standpoint of data touchpoints, since data is the basis for digitalization projects. A "data journey" is thus a trip made along the data processed in a company. It doesn't matter whether the data represents digital twins of physical objects or entirely data-oriented company processes.

Solution Approach: Data Journey

A data journey is a comprehensive approach to digitalization which uncovers where and how data are produced, collected, saved, analyzed, re-used and interpreted. The journey is thus implicitly reflected in a company's value creation process and business processes.

To put it simply, a data journey analyzes the following aspects (a brief code sketch follows the list):

  1. Data creation: Data are produced from various sources such as users, applications, sensors, social media, transactions, etc.
  2. Data storage: Data are stored for later use using a variety of methods – file-based, in databases, in data lakes or in the cloud.
  3. Data processing: Data are then processed in order to cleanse and organize the information and convert it into a suitable format. This phase may also include combining data from different sources.
  4. Data analysis: Data are analyzed using a variety of techniques such as statistical analysis, machine learning and visualization in order to uncover patterns, relationships and knowledge.
  5. Data visualization and communication: The findings and results of the data analysis are shown in some visual format (e.g. diagrams and charts) in order to effectively communicate the information.
  6. Data-supported decision-making: The findings and results of the data analysis are used as a basis for decisions and actions.
  7. Data value creation chain: Data are often enriched over time so that they can be used as data objects in a business context.
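
To make these stages concrete, here is a minimal sketch in Python using the pandas library; the sensor readings, file name and inspection threshold are invented for this illustration, and in a real data journey each stage maps to actual systems and processes:

```python
# Hypothetical mini data journey: create, store, process, analyze, decide.
import pandas as pd

# 1. Data creation: fabricated sensor readings stand in for real sources.
raw = pd.DataFrame({
    "machine": ["A", "A", "B", "B"],
    "temperature": [71.2, None, 88.4, 90.1],
})

# 2. Data storage: persist the data for later use (file-based here).
raw.to_csv("readings.csv", index=False)

# 3. Data processing: reload and cleanse (drop incomplete rows).
clean = pd.read_csv("readings.csv").dropna()

# 4. Data analysis: uncover patterns, e.g. average temperature per machine.
summary = clean.groupby("machine")["temperature"].mean()

# 5./6. Communication of findings and data-supported decision-making.
print(summary)
print("Machines to inspect:", summary[summary > 85].index.tolist())
```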

Nevertheless, the data journey doesn't end with the analysis of what is produced and processed, and where this happens. On the contrary, it also involves finding out where data become redundant (e.g. through different methods of creation or different applications), where data objects are lacking in the process, which organizational units require access to data objects, what kinds of information requirements those units have, and what benefits can be derived from the data.

Data Protection as a Core Issue

The legal aspect also plays a role in this process. Are the data being processed personal data? Does the company have the rights of use and exploitation? The latter question often comes up when the data creation process occurs outside of the company (e.g. architectural plans, address lists, statistics, etc.).

Using an analysis and perhaps a visual depiction of the data journey, a company can get started on a targeted optimization of its business processes and digitalization projects and quickly reap benefits for the organization.

Example of the visualization of a data journey

The Advantage for Your Company

All in all, a successful data journey can help a company to achieve its business goals faster and more efficiently. It can contribute to improving the quality of products and services, raising customer satisfaction and, when all is said and done, increasing sales.

A data journey also helps companies to set priorities for their digitalization efforts. By considering the various aspects of the journey, a company can determine which parts of the process are most important and can assign resources accordingly. This ensures that the organization makes use of its resources as effectively as possible and that the digitalization process is moved forward as efficiently as possible.

Finally, a data journey helps a company to ensure that its digitalization efforts are in line with its general business strategy. A comprehensive approach to digitalization makes it possible for a company to ensure that the processes and systems it establishes are not only effective, but also correspond to the company's general objectives.

In conclusion, a data journey can provide a company with a timetable for successful digitalization. A company which implements a data journey concept will be better equipped to cope with the complex road to digitalization and to achieve its goals. The experts at Spirit in Projects are always happy to help you develop a data journey for your company.

User Experience and User Interface - Know How by Spirit in Projects

User Experience: More than Just a Pretty Interface

The rise of digitalization is resulting in an increasing number of applications which aren't just optional for users but mandatory. As a result, user interface design – and with it issues related to user experience (UX) and user interface (UI) – is now moving into the spotlight.

Just think of the check-in process for low-cost airlines, which is increasingly available only online, or the ability to contact public authorities outside of business hours – completely impossible just a few years ago, but today part of every digitalization strategy. And over this past year, many of us have started ordering food by smartphone, since more and more restaurants no longer provide this service face-to-face.

User Interface as a Central Point of Reference

This increasing (and sometimes exclusive) use of online channels for customer contact means that user interfaces are replacing personal customer contact – the friendly word, the small gestures of attention, interpersonal relationships in general. As a result, our websites need to make customers feel at home.

What Makes for Good UX Design?

The obvious conclusion is that a good user interface must be modern and clearly structured, and must always be accessible to all users, including those with disabilities – this last requirement goes without saying. In reality, however, many applications pay too little attention to the first two points.

In particular, a good user interface must be appropriate to its task, and must also be able to help and guide users through every step of the application, without being patronizing. Interface evaluation cannot take place in an environment which is far removed from reality – which, unfortunately, is often the case in software tests. For example, actual users will use their smartphones right in the middle of the street, use their tablets while relaxing in front of the TV, or may simply have something better to do than to muddle through the system of user guidance we've come up with. We need to be able to meet users at whatever point they actually use our application, and to be aware of how they interact with it.

As a result, it's absolutely necessary to have solid knowledge not only of our users, but also of their environment, their expectations and their usage habits, and to develop our applications accordingly. Although a pretty user interface is certainly a great thing for customers and developers, it can be frustrating for an actual user if it's not also appropriate to its task, helpful and, in general, a satisfying experience.

Key Factors in UX

User Experience describes a user's perceptions and reactions which arise as a result of the actual or anticipated use of a product, system or service. Achieving a positive user experience requires work on a wide variety of aspects, which must eventually also be brought into harmony with one another.

The following areas in the field of User Experience are receiving particular attention these days:

  • Information architecture describes structuring and organizing information in such a way that it’s easy for users to collect, evaluate and use.
  • Interaction design is concerned with the various aspects of human-machine interaction. In addition to the technology and the timing and sequence of events involved in an interaction, its social and emotional aspects are increasingly being taken into account.
  • Usability describes how well an application supports users in a given context of use.
  • Visual design, or UI Design, is the UX area responsible for the aesthetic design of the final product.

Considerations related to User Experience must begin right from the start of a project, with close collaboration with usability experts – first on the part of the business analyst and later the requirements engineer – to identify and prepare the groundwork required for optimal solutions.

We’ve just released a new training package which focuses on Usability and User Experience, and is based on recognized standards in the UXQB field. These courses will give you the tools you need to meet digitalization project challenges related to User Experience.


Learn more about our trainings on UX/UI:

Digitalization, Digitization and Digital Transformation – Are you ready?

Digitalization, Digitization and Digital Transformation – It’s Now or Never!

Digitalization – digitization – digital transformation – what exactly are the differences between those terms, and why do you need to know how they're defined? It's obvious that information technology has radically changed not only work life, but also day-to-day living in general, and a number of terms which use the word "digital" are being used to describe that phenomenon. In this blog, we clarify how IT professionals use those different terms, and what that means as a result.

Until recently, digitalization was simply the accepted way for a modern corporate strategy to make an organization fit for the future. Over these past few months, however, it has become an absolute necessity, and in these special times it provides companies with the flexibility they need to continue to work productively and in an orderly fashion in spite of all challenges.

A company's digital fitness has now become a key factor in its resilience to crises, and now might be the last good chance for many businesses to take the right steps to ensure they don't fall so far behind that the competition becomes uncatchable. As a result, it's crucial that you know the three levels of digital change and are able to make an honest self-assessment of your own business as well as define the next steps to take.

What is Digitization?

Digitization describes a change at the very beginning of the process: the conversion of information into a digital form that can be read and processed by computers. The term therefore describes a technology which has existed since the very earliest days of information technology. It's only through digitization that the information which exists in the world can be made usable for IT systems.

As a general rule, the greater the availability of quality data from as many sources as possible, the better those data can be evaluated, linked together and prepared for decision-making purposes. The concept of data wealth applies here. The challenge lies in obtaining data of good quality, and digitization techniques are constantly improving.

Currently, the use of AI (artificial intelligence) is essential for progress in this field. Artificial intelligence has been used for a number of years for text recognition in printed documents, with natural language recognition being another important application. And recognizing objects in images and correctly interpreting scenes captured by cameras is possible only through AI.
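
As a small illustration of AI-supported digitization, the following Python sketch extracts text from a scanned page. It assumes the Tesseract OCR engine plus the pytesseract and Pillow packages are installed; "scan.png" is a hypothetical input file:

```python
# Digitization in miniature: convert the pixels of a scanned page
# into machine-readable text via OCR.
from PIL import Image
import pytesseract

# Open the (hypothetical) scanned page and run text recognition on it.
text = pytesseract.image_to_string(Image.open("scan.png"), lang="eng")
print(text)
```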

In addition, digitization will also remain an essential task in the future so computers can record reality outside of their own digital worlds. Even today, sophisticated sensors make it possible for computers to see, hear and even smell better than a human ever could. Areas of application such as autonomous driving, medicine and agriculture are key examples of this field of application.

What is Digitalization?

Digitalization refers to a change to an overall process. The term describes the comprehensive use of information technology in the implementation of business, production and service processes – comprehensive meaning that computers are used in a process from start to finish. The necessary data are passed digitally from one step to the next, or are already available digitally, and there are no media discontinuities (no switches of medium requiring manual re-entry) in processing.

Digitalization requires that companies convert their technical ecosystems to information technology, and it’s essential that all functionality and relevant data can be used anywhere they’re required, without technical barriers.

One significant consequence of digitalization is an explosive increase in data, which can be explained by the fact that data are not only processed automatically – the vast majority of data today are also created automatically. Big data methods and technologies address this challenge. Data need to be made usable, since more and more political and business decisions are being made quickly and reliably on the basis of information which is detailed, dependable and up-to-date.

What is Digital Transformation?

Digital transformation goes even further – the term refers to a change in business models, customer relationships and even social structures. This is accompanied by changes in market and business structures which are based on the use of information technology. These extremely rapid changes are having so sweeping an effect on existing processes and systems that the term disruptive is often used instead.

Digital transformation is changing society, business and the world of work, and innovations and disruptive changes must be considered the main factors responsible for this. Technology and its use are integral in determining whether an organization or company will be successful over the medium term.

Examples of digital transformation are by no means new – the process got underway in the mid-90s and continues to gain momentum. Examples of the phenomenon include the online mail-order industry, Internet banks and insurance companies, arranging personal transportation via app, streaming providers in the field of media, the use of social media for targeted advertising, and much more. In our opinion, the fact that many activities which previously involved at least some travel are now carried out via web conference is likewise leading to permanent changes in the travel industry, in particular when it comes to business trips.

In light of established examples of digital transformation, it becomes apparent that speed is more important than company size, and technical competence more important than capital. An important point here is the improved ability to obtain cost-effective, highly scalable and globally available IT infrastructures through public cloud services.

Bottom line: Digitization isn’t enough – go for transformation

Even our quick glimpse of the differences between digital concepts shows that digitization alone does not go far enough; it's clear that reliable data are an essential precondition for all processes further down the line. Digitalizing processes can often unlock savings potential as well as boost customer satisfaction. However, to survive over the medium term, companies need to review their business models and adapt them with an eye to digital transformation. In order to remain or become a leader in the digital transformation race, you need to keep an eye on the following items and build up a corresponding skill set in your organization:

  • Agile management and design thinking
  • Cloud technologies
  • Artificial intelligence
  • Big data

And as experts in digital innovation and technology, we can help you do exactly that.


OUR TRAINING OFFERINGS ON INNOVATION TECHNOLOGIES:

Artificial Intelligence (AI) in Medicine has numerous applications

Artificial Intelligence (AI) in Medicine

The term artificial intelligence (AI) comes up in the IT world time and time again – but what does it actually mean? The truth is, there's no single generally accepted definition of artificial intelligence. In principle, the term describes systems which have the ability to perform activities normally associated with human intelligence, such as learning, logical deduction, and understanding and reacting to one's environment. AI isn't one specific program, algorithm or anything of that nature; rather, it represents a combination of various methods used to carry out such complex tasks.

AI is also being applied increasingly in medicine as well, and many medical fields, including diagnosis, treatment and drug development, are already taking full advantage of the technology. However, the right AI method must be used in order to achieve ideal results.

Sissi Zhan, Medical Informatics Engineer and Consultant for Spirit in Projects

One area in which great hopes are attached to the application of AI is medicine. As a result, we’ll take a closer look at a few possible applications and the potential they bring.

AI Applications in Medicine

Diagnosis

Diagnosing diseases requires great knowledge and experience on the part of medical experts and consumes much of their time. AI systems can learn this capability by being trained on large volumes of data.

Most AI applications for medical diagnosis are based on analyzing images and identifying their content. Image analysis is frequently used (e.g. with X-rays) to identify characteristics which indicate specific diseases, and some of this technology is already quite advanced. For example, some AI systems can already identify cases of skin cancer from skin images with high accuracy. Those applications are based on image recognition and image processing. Key AI methods which have led to extremely successful systems in this field include neural networks and deep learning.
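
As a rough sketch of this deep-learning approach – transfer learning with a pretrained convolutional network adapted to a two-class task such as benign vs. malignant lesions – the following Python code assumes PyTorch and torchvision are installed. It is an illustrative skeleton, not a validated medical model:

```python
# Transfer learning skeleton for a two-class image diagnosis task.
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on generic images.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Replace the final layer with a two-class head for the medical task.
model.fc = nn.Linear(model.fc.in_features, 2)

# In a real project the model would now be fine-tuned on labeled skin
# images; here we only run a forward pass on a dummy batch of one
# 224x224 RGB image to show the data flow.
dummy_batch = torch.randn(1, 3, 224, 224)
logits = model(dummy_batch)
print(logits.softmax(dim=1))  # class probabilities
```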

In addition to image analysis, diagnosis systems which are made available as chatbots and "communicate" directly with physicians and patients are also in wide use. Such systems record a patient's symptoms and use their underlying knowledge base to ask further relevant questions, eventually leading to a diagnosis. To that end, methods for analyzing language are used in combination with rule-based systems (such as expert systems) which emulate the learning and decision-making capacity of an expert.
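
The rule-based component of such a system can be illustrated with a deliberately tiny Python sketch. The conditions and symptoms here are invented for the example; real systems draw on curated medical knowledge bases combined with language-analysis methods:

```python
# Toy rule base: each condition maps to the symptoms its rule requires.
RULES = {
    "common cold": {"runny nose", "sneezing"},
    "influenza": {"fever", "muscle aches", "fatigue"},
}

def suggest(symptoms: set[str]) -> list[tuple[str, float]]:
    """Rank conditions by the share of their rule symptoms reported."""
    scores = []
    for condition, required in RULES.items():
        overlap = len(symptoms & required) / len(required)
        if overlap > 0:
            scores.append((condition, overlap))
    return sorted(scores, key=lambda s: s[1], reverse=True)

print(suggest({"fever", "fatigue"}))  # -> [('influenza', 0.666...)]
```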

Treatment

A suitable treatment strategy is established following a diagnosis. However, medications affect each patient differently in terms of efficacy and adverse effects, so an individual treatment strategy for each patient would be ideal.

AI can help here, by combing through data and knowledge, and analyzing available health data and a patient’s case history to facilitate selection of the proper medication. For example, an AI application can process data, quickly see that a patient was previously unable to tolerate a certain medication, then suggest to the physician that a different medication be prescribed.

In addition, AI can help ensure that treatments are performed in accordance with the latest research. Such applications first analyze existing medical literature, then compile the latest findings on treatments and medications, making it possible for medical personnel to come up to speed on a treatment even if they have little time available.

Systems of this kind must have the ability to read text, identify the relevant content, then derive rules from it. As a result, they also need to be able to understand language as well as its structure and meaning. In addition, these systems need to build up a base of medical knowledge which gives meaning to any text content they find.

Drug Development

Normally, the development and approval of drugs is a process that lasts for several years and is very costly. In addition, only a few medications successfully pass all test phases. AI can significantly speed up and improve this process by supporting various activities over the entire drug development process.

The first step in the process is to identify which molecules (so-called targets) need to be addressed in order to treat a disease. A search is then carried out for active ingredients which act on those targets. AI can be applied to, and can speed up, both of these stages. In subsequent pharmaceutical studies it then becomes essential to find suitable participants, for a number of reasons including avoiding the distortion of study results. Here again, AI can aid in the search for suitable subjects.

In contrast to more basic computer programs, AI systems can use machine learning to go through the multitude of available data, and can also process data which are structured differently. Systems then identify relevant patterns in the data and learn which characteristics indicate targets that show promise. Similarly, the systems can also predict the effectiveness of different medications on the target identified, and can also find suitable study participants.
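
A minimal sketch of this pattern-learning idea, using scikit-learn on synthetic data – the eight "molecular descriptors" and the "promising" label below are fabricated purely for illustration:

```python
# Learn which (synthetic) descriptor patterns indicate promising targets.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                   # 8 invented descriptors
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # synthetic "promising" label

# Train on one part of the data, evaluate on held-out examples.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))
```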

Summary

The above examples of medical applications alone show how AI methods are already being used successfully in fields which require learning from especially large volumes of data. Before planning and defining your own AI application, it's essential to have a solid overview of the various methods available in the field, as well as their possibilities and limits. It's precisely such an overview that you'll acquire through our well-founded, practically oriented AI training courses.


Learn more about our trainings for innovation technologies:

Spirit in Projects - Your experts for cloud architectures

Service Scaling Through Cloud-Native Computing

Being in the cloud is just the start – it's even more important to take advantage of the special opportunities provided by cloud computing, one of which is the almost unlimited scalability of the applications and services you make available through the cloud. However, you first need to make sure the services you use are cloud-native and use orchestration services such as Kubernetes. The following overview of the architecture of such a system was prepared by our cloud experts.

First, the software and libraries required at the operating system level must be bundled using so-called containers. Containers use the kernel of the host operating system, but are separated from other containers and from the host at the process and file level. In addition, containers can be precisely configured in terms of the amount of resources they consume.
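
As a small illustration of such resource limits, the following sketch starts a container with capped memory and CPU via the Docker SDK for Python (package "docker"). It assumes a local Docker daemon and uses the public nginx image as a stand-in for your own application:

```python
# Start a container whose resource consumption is precisely configured.
import docker

client = docker.from_env()
container = client.containers.run(
    "nginx:alpine",
    detach=True,
    mem_limit="256m",        # cap memory consumption at 256 MB
    nano_cpus=500_000_000,   # cap CPU usage at half a core
    name="demo-web",         # hypothetical container name
)
print(container.status)
```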

Operating these containers within a cluster of virtual or physical computers requires the use of a container orchestrator. Kubernetes represents the dominant container orchestrator on the market (with a market share between 80 and 90 percent, depending on the survey in question); Docker Swarm and Apache Mesos provide alternative orchestrator options but only play a minor role on the market.

Kubernetes was originally developed by Google and was released in 2014 as an open source platform for the automatic deployment, scaling and management of containerized applications. Today, Kubernetes is managed by the Cloud Native Computing Foundation (CNCF). The CNCF was founded in 2015 as a non-profit organization, and is part of the Linux Foundation, which means that the continued development of Kubernetes is ensured well into the future.

A business looking to use Kubernetes can either install it itself, on-premises or in the cloud, or make use of a managed Kubernetes cluster. Market research shows that the number of self-managed Kubernetes installations is dwindling, while the use of managed Kubernetes cluster services is growing rapidly.

Architecture of a Managed Kubernetes Cluster

A Kubernetes cluster (K8s) consists of two components (see the figure; source: Microsoft Azure documentation):

  • The control plane, which is responsible for controlling the K8s cluster; it is managed by the provider, who often even makes it available free of charge.
  • Nodes, on which the applications packed into containers run. Nodes are managed by the customer and, like virtual machines, are billed based on processor type, number of virtual CPUs and RAM. Nodes are grouped into node pools.

Implementing a Managed Kubernetes Cluster (K8s)

A K8s can be installed in a number of ways – via command line, through the cloud provider’s portal or by using tools such as Rancher or Terraform.

The number of nodes and the Kubernetes version are specified during installation, and some providers make it possible to specify higher availability for the control plane. The K8s cluster then becomes available after 5 to 15 minutes. Its performance can be modified at any time by adding or removing nodes. As with VMs, a provider bills the nodes configured in a K8s cluster independently of their utilization. Autoscaling is one option available for optimizing costs.

Autoscaling in the Cloud

Autoscaling can be used to automatically provide an application with additional resources (CPU and RAM) as needed. Administrators can configure the related rules in advance, using parameters such as number of requests, CPU/RAM utilization, etc., so that manual steps during operational management are unnecessary and the application reacts virtually immediately to changes.

Both upscaling and downscaling can be performed automatically: based on the configuration, the number of nodes is adjusted and additional instances of applications/services are started or stopped. This is made possible through so-called pods.

A pod represents the smallest deployable object in a Kubernetes cluster, and one or more containers can run in a pod. A pod represents one instance of an application, and an application can be scaled up (= horizontal scaling) by starting additional pods.
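
A minimal sketch of configuring such pod-based autoscaling with the official Kubernetes Python client follows; the deployment name "web" and the thresholds are hypothetical, and in practice the same object is often declared in YAML instead:

```python
# Create a HorizontalPodAutoscaler that adds pods above 70% CPU load.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig credentials

hpa = client.V1HorizontalPodAutoscaler(
    api_version="autoscaling/v1",
    kind="HorizontalPodAutoscaler",
    metadata=client.V1ObjectMeta(name="web-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="web"
        ),
        min_replicas=2,
        max_replicas=10,
        target_cpu_utilization_percentage=70,
    ),
)
client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```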

Practical Example of Autoscaling in the Cloud

The example clearly shows how the autoscaler begins to start up additional pods when the critical processor load defined by the administrator is reached (graphic at lower right). However, since the existing nodes are then above the load limit, the additional pods cannot start immediately (shown by the small red peaks in the same graphic); additional nodes must first be started up. The application is then able to manage the increased usage volume with the help of the additional nodes and pods.

Our Goal: A Cloud Strategy for Your Company

Before you can develop a complete cloud strategy and implement it meaningfully for your business, you should become familiar with cloud-native methods. A basic understanding of how scaling works in the cloud will help you to better identify application cases and processes that can benefit from it. A scalability approach which is simple yet comprehensive is one of the many ways you can get the best out of the cloud, and our courses and consulting services can show you how you and your company can make optimal use of the opportunities provided by cloud computing.


Learn more about our trainings on innovation technologies:

Cloud Management – Possibilities and Success Factors

Cloud Management – Possibilities and Success Factors

Moving IT services to the cloud is currently a key issue for many companies. A critical factor in this move is not only making applications, databases and network structures cloud-appropriate, companies also need to consider how they’ll administer their cloud services in the future. In this blog article, we’ll share with you what we’ve learned from our practical experience regarding possibilities and success factors for good cloud management.

Using cloud services, such as the well-known public clouds offered by AWS, Azure and Google, provides businesses the following advantages:

  • The resources they need can be scaled.
  • Costs and administration are transparent and traceable.
  • Specialized services such as AI, blockchain and specialized caches are easy to access and use.

Case Example: Moving a Company into the Cloud

Let’s take the example of an international Austrian-based company which up to now has used servers in the data center of an external provider, who’s also responsible for server administration and for installing the applications used there. IT and other services are billed on a monthly basis. Before converting to cloud services, the company needs to ask itself how this relationship should be regulated. The following three items are essential at the very start for successful cloud management:

  • Clearly defining the cloud administrator’s role
  • Using container technologies such as Docker
  • Making optimal use of scalability

Clearly defining the cloud administrator’s role

When moving into the cloud, it's a good idea to clearly define, and if necessary reorganize, the interfaces between application developers, infrastructure providers (cloud providers) and the eventual cloud administrator.

The role of cloud administrator can be assumed either internally or by an external service provider. If the role is taken on by an external service provider, the subscription should be in the name of the client, not the cloud administrator, and should come with clear guidelines for the actual users of resources. This allows the client to be billed directly for the resources used and, through the detailed bill itemization, to have complete transparency into the actual cost structure.

The subscription should give the cloud administrator access to resources, but not the right to change the subscription itself (e.g. methods of payment). Should it become necessary to change administrators (e.g. because the service provider becomes insolvent or simply performs poorly), this can be done simply by adjusting authorizations (users and permissions granted).

Using container technologies such as Docker

Because of the complications involved in installing software applications, the interface between application development and administration is often complex. Container technologies such as Docker provide a good solution to this problem, especially when a business is using cloud services.

Relying on container technologies such as Docker then makes it possible for cloud administration to concentrate on essential tasks, namely using cloud resources (processors, storage, load balancing, CDN, managed databases, etc.) to create a system environment which is secure, scalable, high-performance and cost-effective.

Application developers are then responsible for completely provisioning software on the basis of containers. Application installation becomes unnecessary, since the container contains the fully configured runtime environment. As a result, this also rules out installation errors.

Making optimal use of scalability

Virtual machines that run continuously at constant capacity are currently still more cost-effective to operate in a traditional data center than in the cloud. The key attraction of cloud offerings, however, lies in the fact that infrastructure can be scaled easily – throughout the world if necessary – something a business cannot provide in its own data center in a comparably cost-effective manner.

By using scalable Docker containers, a cloud administrator can react quickly to additional demand: with just a few simple commands, the number of container instances running simultaneously can be increased, and reduced again when no longer needed. This process can even be automated by using container management systems like Kubernetes.
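
As a brief illustration, scaling a containerized service up and down again can look like this with the official Kubernetes Python client; the deployment name "web" and the replica counts are hypothetical:

```python
# Adjust the number of simultaneously running instances of a service.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

# Scale up to 5 instances to absorb additional demand ...
apps.patch_namespaced_deployment_scale(
    name="web", namespace="default", body={"spec": {"replicas": 5}}
)
# ... and back down to 2 when demand drops.
apps.patch_namespaced_deployment_scale(
    name="web", namespace="default", body={"spec": {"replicas": 2}}
)
```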

In order to use scaling efficiently, the related responsibilities must be clearly defined between application development and administration; the use of containers is a must, and development is responsible for provisioning scalable applications.

Making Use of the Potential Provided by the Cloud

Cloud services mean companies have access to powerful, scalable infrastructure tools, which if used selectively and in combination with good cloud management will yield all sorts of potential – and not just for large companies.

Any business looking to procure cloud resources must consider their vastly different technical and organizational conditions as compared to traditional IT systems. Spirit in Projects specializes in developing cloud concepts and technical documents for procurement processes and invitations to tender. We bring in our technical expertise and work with our clients to develop viable, sustainable concepts, prepare the necessary documentation and provide support in the procurement process. 


Learn more about our trainings for innovation technologies:

Agile Requirements Engineering – Success Factors for Implementing Agile Methods

Agile Requirements Engineering – Success Factors for Implementing Agile Methods

Think requirements specifications and functional specifications are too cumbersome and boring? You’re in luck! Agility’s the latest buzzword. Projects need to be implemented quickly. As a result, agile methods are what everyone’s talking about, and are being used for all stages of a project, from eliciting requirements to implementation. We’ve identified six success factors for using agile methods in requirements engineering and project implementation.

Agile methods are enjoying greater and greater popularity these days. Innovative digitalization projects require flexibility, not only in implementation but also when it comes to requirements, and projects that need to meet the challenge of ensuring agility in requirements elicitation and clarification readily rely on agile requirements engineering techniques.

What’s Agile Requirements Engineering?

The preferred agile methods for gathering and managing requirements over the course of an agile requirements engineering process include:

  • Workshops
  • Specifying results by means of user stories, epics, technical stories, etc.
  • Backlogs and kanban boards for managing requirements over the course of implementation, provided either physically (central boards) or electronically (in tools such as Jira)

Success Factors for Agile Requirements Engineering

These methods are relevant and essential, and their use leads to outstanding results in many projects – although they’ve also resulted in major problems for many other projects. Viewpoints on this subject can be quite extreme – some organizations invoke agility as the magic word that will solve everything, while others have had a number of bad experiences and have virtually forbidden use of the word in connection with projects.

Agility isn’t just a technique, it’s also a management philosophy when it comes to projects and businesses. Our experience shows that success doesn’t come from just making use of processes and techniques – there are success factors which are much more fundamental.

Wolfgang Hiermann, CEO and agile coach

Based on our experience in project management and consulting, the following six items have emerged as success factors in this area:

  • Product leaders, not product owners
  • Actually carry out short feedback cycles
  • Developers enthusiastic about the product
  • Expedition-friendly organization
  • Openness and transparency
  • Keep development close to home

These factors are the pieces in the puzzle for ensuring the success of agile requirements engineering and agile projects, and when they work in combination, they result in a functional whole through which agile methods and techniques can be used meaningfully.

Success Factor 1: Product Leaders, not Product Owners

Agility means flexibility, and flexibility is sustained by the willingness and ability of stakeholders to question goals (in particular usage goals), adapt decisions and change priorities. However, what’s most important is that stakeholders are enthusiastic about the product, and actively seek to shape it.

As a result, successful agile projects need not so much a strictly administrative product owner (which unfortunately is often what they get), but rather a group of product leaders who work in close collaboration and, as stakeholders, drive the project forward with foresight and energy.

Success Factor 2: Actually Carry Out Short Feedback Cycles

Unfortunately, many agile projects fail exactly because interaction between stakeholders and the agile team is reduced to a minimum after the initial workshop. To implement agile projects successfully, it’s not enough just to describe requirements by means of user stories and to manage them in backlogs.

Short sprints actually need to be carried out, a tangible result must exist by the end of each sprint and, most importantly, the most relevant stakeholders need to actually see that result, work with it and, if necessary, revise requirements and goals.

Success Factor 3: Developers Enthusiastic about the Product

Unfortunately, developer participation is often neglected for fear of gold-plating. Yet the reality is quite the opposite – enthusiastic developers don't want to program something that nobody's going to use; they want to see users who love their product, and even hear a sincere "thank you" for their efforts every now and then.

This is because agile flexibility in development is difficult if developers can’t form a true attachment to the result. A highly successful implementation of requirements and technology will be achieved only if developers are convinced of the product’s benefits and have the opportunity to actively take part in shaping requirements.

Success Factor 4: Expedition-Friendly Organization

“Changes are stressful. Nothing ever improved through changes. We’ll first wait to see what we’re up against, then we’ll deal with it.” – If that’s the client organization’s philosophy toward life, then the team members of an agile project will feel like extraterrestrials just before landing, and the potential for frustration in the project will be great.

That’s because the saying “In for a penny, in for a pound” applies when it comes to agile requirements engineering. Stakeholders and organizations who aren’t completely enthusiastic about the agile approach would be better off not to bother at all and to make use of more conservative methods – also meaning they’ll need to make do without the advantages of flexible development.

Success Factor 5: Openness and Transparency

“It’s not clear why we’re doing this job like this. We’ve always done it that way, so we need to keep doing it that way in the future.” Of course, you’ve heard statements like that. Sometimes questioning a process is perceived as being presumptuous, or even insulting. Many questions should only be asked in the right context, or sometimes even not at all. And all too often recommendations first need to be reconciled and decisions prepared in a small group of people before they can be discussed openly with all stakeholders.

The famous statement by Peter Drucker applies here: "Culture eats strategy for breakfast." And this is exactly the point at which agility (and with it project democracy) dies. Effective agility requires open cooperation and mutual respect.

Success Factor 6: Keep Development Close to Home

It's true – you can outsource even agile development. However, the greater the distance, the higher the effort and expense for agile project management and communication. Although continually improving tools for videoconferencing and live document collaboration make work much easier, they're no replacement for truly short paths of communication.

As a result, agile projects greatly prefer nearshoring over offshoring. Top efficiency and effectiveness, however, are achieved by keeping the distance between stakeholders and the development team as small as possible. Onshoring agile projects – or, sometimes even better, onboarding the agile team on-site for the duration of the project – are successful approaches in this regard.

Bottom Line: Culture and Know-how

If you take a look at these six success factors, you'll see they have a lot to do with corporate culture and with building up internal know-how. Making use of agility in requirements engineering and in projects requires both time and knowledge of agile methods. Project members must have thorough knowledge of the product as well as a solid basic technical understanding.

This means that building up these skills within a team is an essential step for any organization moving in an agile direction, and we at Spirit in Projects, through our training and consulting, can provide you the support you need to establish agility in your organization as well.


Learn more about our trainings for innovation & agile methods:

Requirements Engineering and Digital Transformation - Current Trends

Trends in Requirements Engineering – Methodological Digital Transformation

“Why do we still need requirements? We’re agile!” Have you ever heard something like that? In these times of rapid innovation, just the term requirements engineering sounds pretty laborious. Nevertheless, just in view of the radical transformation of companies and business models, requirements engineering is playing a central role in managing digital transformation. However, making this possible also requires a transformation in the methods applied. As a result, in this blog article we present you with the latest trends in requirements engineering.

All too often innovation pressure, whether real or imagined ("We need to innovate!"), results in projects which get bogged down before they're halfway done and run up horrendous costs. Why does this happen? Because the requirements gathered aren't specific or accurate enough. As has long been accepted in the field of IT, requirements engineering is one of the most important factors determining the success of a project. As a result, many organizations have made great efforts to ensure that their requirements are correct and that they flow into their projects in a well-coordinated manner and at the right time. And in addition to conventional methods, those organizations are also relying more and more on agile processes.

The Challenge of Digital Transformation

So where does the challenge lie? In spite of sound methodological principles, teams in projects which focus heavily on the digital transformation of large companies or organizations often find conventional requirements engineering methods inadequate. Those methods are sometimes seen as impractical, and often as too slow and excessive. Agility isn't always an adequate solution here either, since agile requirements engineering methods are sometimes considered relevant only to development, or are seen as too superficial and unsuitable for complex tasks. In a nutshell, established requirements engineering tools fall short when it comes to managing digital change.

“These days, a requirements engineer is faced with great changes – approaches in this field have undergone fundamental change, and in some cases have even turned completely upside down.”

Karl Schott, CEO Spirit in Projects

Trend: The Moving Target

No one who works in the field of requirements engineering for an organization committed to digital change will be able to implement projects as they did previously and still be successful. The reason for this is that the role of the requirements engineer has fundamentally changed:

“More and more, requirements engineers are finding themselves in digital transformation projects which are driven by a vision and strategic goals, but where neither the path to be taken, nor the project’s actual benefits, nor the strategic technology for implementation exists at the outset – all of this first needs to be worked out over the course of the project.”

Karl Schott, CEO Spirit in Projects

This is reflected in the problems of the industry, and in the current discussion going on in research surrounding the subject of ubiquitous requirements engineering. When it comes to innovation processes, the theme of requirements analysis forms a framework which surrounds the entire project.

Aspects of Ubiquitous Requirements Engineering

The term ubiquitous refers to the fact that requirements engineering is omnipresent in innovation projects. This results in a number of problems and changes which in our opinion account for the current trends in requirements engineering:

  • Open-ended: Analysis projects for establishing technical ecosystems must be implemented without clear end criteria.

  • Holistic approach: A holistic approach moves into focus when the solution affects its environment so strongly in terms of technology and organization that clear boundaries cannot be established for the analysis.

  • Borderless systems: Systems cross borders in cases where the solution is being developed and used across regions or globally and participants are difficult to manage in terms of location and/or time.

  • Everyone: Many more voices join the conversation as digital transformation brings in an increasing number of stakeholders, including those who cannot contribute expertise to the development of solution ideas.

  • Crowd: When it can no longer be determined whether users are key stakeholders or if it’s not even clear whether they’re actual people, the elicitation of requirements must be performed automatically.

  • Outside the comfort zone: When the problem to be solved can no longer be analyzed using an organization's own know-how and skills, and solving it instead requires collaboration across different domains, organizations and/or companies – that's working outside the comfort zone.

An important trend is also how the role of requirements engineering has changed for companies themselves – it no longer concerns “just” eliciting requirements, but instead bringing in possible technologies and solution approaches at a very early stage. The trend is moving towards requirements engineering providing even more:

  • Strategic consulting: This is because corporate strategies are relying more and more on the use of technologies to find and implement solutions.

  • Business enabler: In cases where the possibility of using technologies determines market success.

  • Orientation toward disruptive technologies: In cases where new technologies are being brought in to make it possible for a company’s new requirements to exist in the first place.

These trends point to a development we refer to as "new requirements engineering". Our website contains detailed articles on the practical experience we've gained from actual projects; you can also read more there about our consulting approach in the area of requirements management.


Learn more about our trainings for innovation & digital transformation:

Spirit in Projects offers expertise in agile methods and Kanban

Agile Methods