Building the Software Backbone of European Defense: Real-Time, AI-Driven, and Distributed

As Europe moves to reinforce its defense capabilities amid rising global instability, a structural shift is underway in how defense systems are conceived and built. Rather than relying solely on traditional primes and centralized platforms, the next generation of European defense is becoming real-time, AI-driven, interoperable, and increasingly enabled by specialized technology firms.

One example of this transformation is Bulgaria’s Wiser Technology (BUL:WISR), a software engineering company that has quietly become a key contributor to several strategic NATO and EU-led defense programs. Over the past decade, Wiser has participated in more than a dozen EDF, EDIDP, and NATO-aligned initiatives — many in close collaboration with European defense primes, leading research institutes, and universities.

Wiser’s software solutions, from real-time data streaming to AI-assisted decision-making and secure communications, support major defense programs such as NATO’s Alliance Ground Surveillance (AGS), the EDF-funded European Strategic Command and Control System (ESC2/E2C), and cyber defense and AI innovation projects like FaRADAI and EU-GUARDIAN.

In the NATO AGS project, Wiser developed a mission-critical software system that enables the Main Operating Base to process live imagery and motion data from airborne radar systems and ground stations in near real time. In ESC2/E2C, coordinated by Indra with partners such as Leonardo and Rheinmetall, Wiser is contributing to the development of a multi-domain command platform designed to unify operational coordination across European missions.

Meanwhile, in MARTINA, an EDF project coordinated by Royal Netherlands Aerospace Centre (NLR), Wiser helps validate AI for satellite-based defense applications by delivering backend infrastructure for AI performance assessment and traceability. In ASTERION, the EDF-funded underwater communications initiative led by the Netherlands Organisation for Applied Scientific Research (TNO), Wiser contributes ultrasonic protocol expertise and system design for secure, multi-modal data transmission.

The company operates under rigorous military compliance frameworks, including certification to NATO AQAP 2110 and AQAP 2210, and holds national, EU, and NATO classified information clearances — a prerequisite for participation in high-level defense programs. These credentials have enabled Wiser to build long-standing partnerships with European defense leaders such as Rheinmetall, Hensoldt, Rohde & Schwarz, Indra, Leonardo, and others, reflecting a broader shift in EU defense toward agile, interoperable, and cross-border software integration.

This trend was underscored earlier this year at the German–Bulgarian Defence Industry Day, where Major General Stefan Schulz of the German Federal Ministry of Defense remarked: “Bulgaria is a proven and reliable partner with a growing strategic role in the Black Sea region.” He also announced the upcoming appointment of a Defense and Security Cooperation Advisor at the German Embassy in Sofia — a role held in only a handful of countries worldwide.

Wiser contributes not only through technology but also to the strategic defense dialogue at the European level. During GITEX EUROPE 2025 in Berlin, Dimitar Dimitrov, General Manager of Wiser’s Automotive, Aerospace & Defense Division, participated in a panel discussion on cyber defense alongside representatives from the German Ministry of Defense, the Geneva State Police, and the leading cybersecurity company Kaspersky.

Looking ahead, Wiser is a strategic partner of the upcoming Regional Defense Tech Summit, taking place in Sofia on October 28, 2025. The event will bring together over 500 stakeholders from EU institutions, government officials, defense contractors, and dual-use tech innovators to discuss:

  • The shifting geopolitical order and the Black Sea’s growing strategic role
  • Europe’s progress toward independent, interoperable defense capabilities
  • The integration of AI and emerging technologies in defense systems
  • New models of collaboration across national, industrial, and technological boundaries

From real-time surveillance to secure communications, Wiser Technology is helping to build the digital backbone of European defense. Its growing role in multinational projects signals a larger shift in how defense capability is developed: through agile, high-performance software systems engineered by specialized firms working in cross-border networks.

Europe’s security depends on a robust, integrated ecosystem of innovative and agile technology companies capable of delivering next-generation defence solutions at scale. We are witnessing a clear shift towards distributed innovation networks, where companies across Europe collaborate to develop and deploy Software-Defined Defence capabilities.

Dimitar Dimitrov, General Manager, Automotive, Aerospace & Defense

Driving Success with Expert SAP Solutions

At Wiser Technology, we’ve seen firsthand how SAP technology can be a game-changer — not just for operations, but for the entire business. When done right, it brings clarity, speed, and intelligence to decisions that matter. That’s why we don’t just implement SAP solutions — we help our clients unlock their full potential.

As a Gold SAP Cloud Partner with a growing list of SAP Quality Awards, we work side by side with companies looking to integrate, optimize, or completely rethink their lines of business. From digital HR to business process automation, from legacy infrastructure to cloud migration — we’re in it for the long haul, ensuring every step moves the business forward.

Enabling HR to Work Smarter — and People to Feel It

HR transformation starts with a single question: How can we make work better for people in the entire organization? For us, SAP SuccessFactors is a big part of the answer. We help organizations digitize the entire employee journey—from recruitment through career development, to retirement—using SAP SuccessFactors HCM and AI-powered tools like Joule.

Our team has helped companies in eight countries go live with 35+ SuccessFactors Employee Central and Talent Modules. We master the tech inside out, but more importantly, we understand the HR processes and how to make them work for real teams at scale. Whether it’s a green-field implementation, migrating from legacy systems, or introducing tools like our own Lights.E-file for digital signatures, we help HR teams move faster and support people better.

Connecting Business Systems in the Cloud

Efficient integration can be tough: different systems, different data formats, and everything needs to work in real time. That’s where our expertise comes in. Using SAP Integration Suite, we connect SAP to SAP and non-SAP systems — cloud or on-premise — so the right transactional data ends up in the right place, every time.

We build custom connectors enhanced with integration flows, manage APIs, and handle the behind-the-scenes complexity so our clients don’t have to. Whether it’s connecting business partners, integrating OEM systems with S/4HANA Cloud, or troubleshooting in real time, we keep things running smoothly in the background.

Building Apps That Fit the Business — Not the Other Way Around

Off-the-shelf isn’t always enough. Sometimes, you need applications built around important processes, not the other way around. That’s why we created our S/4HANA Cloud Applications Factory. With SAP BTP, we design and build cloud-native business apps from the ground up.

From architecture to deployment, our team works with the latest SAP technology and tools such as CAP, Kyma, Cloud Foundry, and SAP Fiori to create apps that are fast, secure, and easy to use. We use generative AI with SAP to accelerate development, and we tap into no-code platforms like AppGyver when speed and simplicity are key. Everything is built to scale — and designed to make life easier for the people using it.

Making Legacy Systems Future-Ready

Legacy code has a way of slowing businesses down. It’s familiar, but it often doesn’t play well in a cloud-first world. We help clients modernize legacy SAP customizations, whether that means refactoring and optimizing code for HANA, identifying and adjusting simplification items, or transforming and migrating master and transaction data to the cloud.

We take care of everything from code analysis to testing (unit, integration, and user acceptance), so that businesses can move to S/4HANA Cloud with confidence — and without losing the know-how that makes their systems unique.

Automating the Work That No One Should Have to Do

Let’s face it: a lot of business processes are still too manual. Too many steps require employee effort, and there is still too much back-and-forth.

At Wiser, we leverage the latest SAP technology to enable powerful business process automation through native workflows, robotic process automation, and intelligent technologies within SAP S/4HANA and SAP BTP. For our clients, we can automate anything from simple task routing to complex, AI-driven processes across functions.

We work closely with teams to identify process bottlenecks, then design and implement automations that increase business agility. The result? Faster delivery, fewer errors, and people focused on the work that really matters.

Scaling Digital Commerce with Confidence

E-commerce is evolving fast — and customer expectations are higher than ever. We help digital businesses grow and adapt using SAP Commerce Cloud and SAP Composable Storefront. From cloud migration to performance tuning and hands-on support, we make sure our clients have the foundation they need to scale securely and effectively.

Trusted by SAP. Focused on You.

We’re proud to be an SAP Gold Partner, and even prouder of the teams behind our success. Our experts are SAP Certified, and our work has been recognized with multiple SAP Quality Awards — but what matters most to us is what we help our clients achieve.

Ready to explore what SAP can really do for your business?

Let’s have a conversation. We’d be happy to show you what’s possible.

Reach out to us!

OTT Video Streaming Platform Migration – Insights & Challenges

The environment

In the ever-evolving landscape of video streaming, growth often demands a transition to more powerful and flexible platforms. Recently, we faced such a challenge with a client who had outgrown their current platform hosted on Vimeo. The task at hand was not just a migration – it was a strategic move to a dedicated, robust, Over-the-Top (OTT) platform tailored to meet the growing demands of their audience.

The steps

  • Content Migration
  • User Data Migration
  • Integration with 3rd Party Platforms
  • Payment System Transition
  • Switch Over Planning

Content migration: A herculean task

Migrating content is the most visible and critical part of the transition. This isn’t just about moving video files—it’s about transferring the entire ecosystem: video assets, metadata, subtitles, artwork, and more. The complexity and volume of data made this task more time-consuming than anticipated, highlighting the need for meticulous planning and execution in content migration.
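To make "the entire ecosystem" a little more concrete, here is a minimal sketch of how an asset bundle and its post-transfer verification could be modelled. The data layout, file naming, and checksum step are illustrative assumptions for this article, not the actual APIs of either the source or the target platform.

```python
# Illustrative sketch only: asset layout and verification flow are assumptions,
# not the actual source or target platform APIs.
from dataclasses import dataclass, field
from pathlib import Path
import hashlib

@dataclass
class VideoAsset:
    """Everything that must travel together for a single title."""
    video_file: Path
    metadata: dict                          # title, description, tags, release date, ...
    subtitles: list[Path] = field(default_factory=list)
    artwork: list[Path] = field(default_factory=list)

def checksum(path: Path) -> str:
    """SHA-256 of a file, used to verify the copy on the target platform."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(asset: VideoAsset, remote_checksums: dict[str, str]) -> bool:
    """Compare local and remote checksums for every file belonging to the asset."""
    files = [asset.video_file, *asset.subtitles, *asset.artwork]
    return all(remote_checksums.get(f.name) == checksum(f) for f in files)
```

Tracking assets this way makes it easy to spot items that arrived incomplete before the switchover, rather than discovering them through viewer complaints.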

User data migration: Maintaining continuity

User data migration is a sensitive and intricate process. Our task was to move user accounts, profiles, settings, watch history, and recommendations without disrupting users. The challenge lay in mapping the legacy data architecture and transferring the relevant data to the new system. A crucial aspect of this was handling passwords, which had to be transmitted in hashed form to maintain security and user trust.
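A minimal sketch of the account-mapping step is shown below, assuming a JSON-lines export from the legacy system and a bulk-import endpoint on the new platform. The file name, endpoint, and field names are hypothetical; the key point is that only the existing password hash, never a plaintext password, is moved.

```python
# Illustrative sketch only: file name, endpoint, and field names are
# hypothetical, not the actual source or target platform APIs.
import json
import requests

LEGACY_EXPORT = "legacy_users.jsonl"                              # hypothetical export file
IMPORT_ENDPOINT = "https://api.example-ott.com/v1/users/import"   # hypothetical endpoint

def map_user(record: dict) -> dict:
    """Map a legacy user record to the new platform's import schema."""
    return {
        "email": record["email"],
        "profile": {
            "display_name": record.get("name", ""),
            "locale": record.get("locale", "en"),
        },
        # Passwords are never handled in plaintext: the existing hash
        # (e.g. bcrypt) is carried over so users keep their credentials.
        "password_hash": record["password_bcrypt"],
        "watch_history": record.get("history", []),
    }

def run_migration() -> None:
    with requests.Session() as session, open(LEGACY_EXPORT, encoding="utf-8") as f:
        for line in f:
            resp = session.post(IMPORT_ENDPOINT, json=map_user(json.loads(line)), timeout=30)
            resp.raise_for_status()   # fail fast so no account is silently dropped
```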

Integration with 3rd party platforms

The new OTT platform demanded the integration of fresh APIs, SDKs, and infrastructure. This step was crucial to support the enhanced features and functionality we envisioned for the platform. Adapting to these new technical requirements was a significant hurdle but essential for the long-term scalability and flexibility of the platform.

Payment system transition: InPlayer integration

A subtle yet vital aspect of our migration strategy was the integration of InPlayer for payment processing. This switch was not just a technical update but a strategic move to enhance the user experience and streamline revenue generation.

Switch over planning: The final leap

The final transition from the old to the new platform required careful orchestration. Our goal was to minimize subscriber impact and avoid confusion during the switchover. This phase demanded precise internal communication among our teams and externally with our client’s audience.

Conclusion

This migration project was more than just a technical task; it was a genuine learning experience for us. Tackling these challenges sharpened our ability to handle complex OTT platform transitions. We’ve gained many practical insights from this experience, which we’re excited to apply in our future work.

Looking back, we see this project as more than just a platform switch. It was a significant change for both our client and our team. We’re pleased to have managed this complicated process successfully, and it’s given us a new level of confidence in our work in video streaming development.

The Great Divide: Model-centric vs. Data-centric approach

The bread and butter of machine learning (ML) are data and models. Because academic data science research and competitions focus mostly on improving ML models and algorithms, the data often remains overlooked. This creates an artificial division between the data and the model in the ML system, framing two separate approaches towards AI – Model-centric and Data-centric.

The benefits of excellent models


A famous quote often attributed to the statistician George Box says that all models are wrong but some are useful. By extension, some models are extremely useful, and some are, let’s face it, useless. To build a good ML solution, you need a model that captures the underlying dependencies in the data, filtering out the idiosyncratic noise and performing well on new, unseen data.

Model improvements can be achieved in various ways. While there are many common recipes and tools for model optimization, for many applications the modelling work remains as much an art as a science. The usual workflow includes:

  • Testing various model architectures and specifications, different objective functions and optimization techniques;
  • Fine-tuning the hyper-parameters defining the model structure and the model-training process.

What is referred to as a model-centric approach is the activity of dedicating time and resources to iterating on the model. The goal is to improve the accuracy of the ML solution while keeping the training data set fixed.
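As a minimal sketch of this loop, assuming a generic tabular scikit-learn setup rather than any particular project, the example below iterates over hyper-parameters of a fixed model family while the training data never changes:

```python
# Model-centric iteration: the dataset is fixed, only the model is tuned.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

param_grid = {
    "n_estimators": [100, 300],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X_train, y_train)          # the training data never changes here
print("best params:", search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```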

The closer one gets to the realistic limits of model performance, the smaller the room for model improvement becomes, and the marginal return on the time and resources spent starts to diminish. This does not mean, however, that the potential of the whole ML solution has been reached; there may still be vast room for improvement.

The benefits of high-quality data


Once you see that you have reached the potential of your model on the given dataset, the usual go-to is the universal “get more training data.” Often this is all you need to reach your performance goals. Sometimes, though, what you need is not more data but better data.

The data-centric approach is concerned with improving the overall performance of the ML solution by focusing on the quality and sufficiency of the data while keeping the model-training part fixed. What the data-centric approach suggests is not novel or revolutionary but a reminder that no model can be better than the data it was trained on, and that improvements in data quality can lead to much larger performance gains for the overall ML solution.
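A minimal sketch of the complementary loop, again assuming a scikit-learn toy setup: the model and its hyper-parameters are held fixed, and only the quality of the training labels changes (here simulated by injecting and then removing label noise):

```python
# Data-centric iteration: the model is fixed, only the label quality changes.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
noisy = y_train.copy()
flip = rng.random(len(noisy)) < 0.15        # simulate 15% labelling errors
noisy[flip] = 1 - noisy[flip]

model = LogisticRegression(max_iter=5000)   # identical model in both runs
acc_noisy = model.fit(X_train, noisy).score(X_test, y_test)
acc_clean = model.fit(X_train, y_train).score(X_test, y_test)
print(f"noisy labels: {acc_noisy:.3f}  cleaned labels: {acc_clean:.3f}")
```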

Data consistency, data coverage, label consistency, feedback timeliness and thoroughness, and model metadata are some of the aspects of the data that can improve your ML solution.

  • Consistent data is data; anything else is confusion and ambiguity. Are the ETL (extract, transform, and load) pipelines providing you with the clean and systematic data necessary for your ML applications? If the answer is no, a greater effort is probably required to improve the relevant processes.
  • Data coverage asks whether the sample you are training your model on is representative of the population your model is going to be used on. If some subpopulations or classes are underrepresented, evaluate what the effect might be and, if needed, think about how to overcome it; data filtering, rebalancing, or data augmentation often help. Another aspect of coverage is content: are all characteristics relevant for discriminating between observations present in your dataset, and do you need, and can you obtain, additional features for your ML task?
  • Label consistency – this is a huge issue for any supervised ML task. From the correct definition of the labels to the accurate labelling of the dataset, every aspect can hugely affect the outcome of model training. There are multiple strategies and techniques for improving the labels in your project, and it is always a good idea to spend some time checking label quality manually, even on a very small subset of the data.
  • Monitoring data – once deployed to production, the ML system is not done. Model performance will inevitably deteriorate due to data or concept drift. Setting up good monitoring for your model is the first line of defence against such a trend. One often cannot foresee in which aspect the input data may shift or how the model’s performance may degrade, so monitoring a wider range of indicators and subpopulations can reveal underlying changes faster (a minimal drift check is sketched after this list).
  • Model metadata – a high-quality ML system also depends on transparency and reproducibility. Model performance metrics and the means for reproducibility can collectively be called model metadata, and they ease the work on model experimentation and optimization.
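As referenced in the monitoring point above, a minimal drift check might look like the sketch below, assuming tabular features and a per-feature two-sample Kolmogorov-Smirnov test; the threshold and the print-based reporting are illustrative choices, not a prescribed setup:

```python
# Input-drift monitoring sketch: compare live feature distributions against
# the training reference, one feature at a time.
import numpy as np
from scipy.stats import ks_2samp

def drift_report(reference: np.ndarray, live: np.ndarray,
                 feature_names: list[str], alpha: float = 0.01) -> dict[str, float]:
    """Return a p-value per feature; small values flag a possible distribution shift."""
    report = {}
    for i, name in enumerate(feature_names):
        result = ks_2samp(reference[:, i], live[:, i])
        report[name] = result.pvalue
        if result.pvalue < alpha:
            print(f"possible drift in '{name}' (p={result.pvalue:.4f})")
    return report
```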

Business and analytic tradeoffs


How to strike the right balance between improving your code and improving the quality of your data? You can – as with any other decision – put some data into use.

Analyze your processes and see how the time spent working on data compares with the time spent working on code to improve the accuracy of your ML applications. Time-box the model optimization work, put the model in production once you reach satisfactory results, and start collecting feedback to gain insight into your model and improve your data set. Prioritize high-quality data throughout all phases of the ML project for the MLOps team.

It might also be worth reconsidering the composition of your ML teams: how many data engineers and analysts do you have versus ML engineers and modellers?

This can be generalized further at an organizational level for any decision concerning your data assets and ML projects. Build and maintain better data infrastructure instead of investing in more ML projects, and consider how better data quality and infrastructure can improve the profitability of the ML projects you do undertake.

Where to go from here?


Starting from the investigation phase of the project, spend some time estimating the upper feasible limit on the performance of the model that is going to be built. If this is a frequently occurring ML task, check the literature for the level already achieved by other data scientists. Alternatively, take a small sample and measure human-level performance on it. This can serve as a guideline for the feasible model performance on the task at hand.

Once realistic benchmarks for the output of your ML project are set up front and the first model prototype is ready, carefully analyze what is missing to get to that benchmark. A quick analysis of your model’s errors, an evaluation against human-level performance benchmarks, and some digging into the potential gaps can guide you on whether it is worth continuing to train and optimize your model, or whether it is better to spend more time on collecting additional data, better labelling, or feature creation. Iterate.
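One way to make this decision step concrete is the rough heuristic below: compare the model’s training and validation errors with an assumed human-level benchmark to see whether the remaining gap looks more like model work or data work. The framing and example numbers are illustrative assumptions, not a fixed rule:

```python
# Rough heuristic: avoidable bias vs. train/validation gap.
def next_focus(train_error: float, val_error: float,
               human_level_error: float) -> str:
    """Avoidable bias points to model work; a large train/validation gap
    points to data work (more data, better labels, better coverage)."""
    avoidable_bias = train_error - human_level_error
    variance_gap = val_error - train_error
    if avoidable_bias > variance_gap:
        return "focus on the model: architecture, capacity, training"
    return "focus on the data: volume, labels, coverage"

# e.g. humans ~2% error, model 3% on training, 9% on validation
print(next_focus(train_error=0.03, val_error=0.09, human_level_error=0.02))
```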

What helps in moving through these phases effectively is a data-centric infrastructure for the ML solution: an automated retraining and deployment process, plus integrated model monitoring that quickly feeds model feedback and new training-data increments back into the process to trigger retraining or reworking. For this purpose, the project requires a mature MLOps infrastructure providing timely, consistent, high-quality data for your system. Tools and expertise for building full MLOps pipelines are accumulating quickly to meet the new requirements and demand in the field of production ML.
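A minimal sketch of what such an automated trigger could look like, assuming a monitored accuracy metric and a counter of newly labelled examples; the threshold values are placeholders rather than recommendations:

```python
# Retraining trigger sketch: retrain on degradation or enough fresh data.
def should_retrain(live_accuracy: float, baseline_accuracy: float,
                   new_samples: int, max_drop: float = 0.02,
                   min_new_samples: int = 10_000) -> bool:
    """Return True when performance degrades beyond max_drop or enough
    new labelled examples have accumulated since the last training run."""
    degraded = baseline_accuracy - live_accuracy > max_drop
    enough_new_data = new_samples >= min_new_samples
    return degraded or enough_new_data

if should_retrain(live_accuracy=0.88, baseline_accuracy=0.92, new_samples=4_200):
    print("trigger retraining pipeline")   # e.g. kick off the CI/CD job
```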

Prioritize data quality over data quantity. Prioritizing the creation and maintenance of systematic, high-quality data for your business unlocks the potential for better analytics and better ML solutions for your organization. Instead of investing in models for each of the multiple use cases you want to address, put your data at the centre of your decision-making and build the data infrastructure that allows you to create cutting-edge ML solutions, reach the quality necessary to make the ML investment profitable, and protect your solutions from costly or hard-to-fix deteriorations in performance.

And know that you are not alone in this. Andrew Ng is on a quest for higher data awareness, and more and more useful content on the topic can be found on the Data-Centric AI Resource Hub.

The data should show the way


The data-centric approach isn’t anything new. Applied data scientists and ML practitioners have always known that the data is the guiding light, the main ingredient of their recipes. What the data-centric approach emphasizes is that, in many applications, the marginal return on data-quality work may be higher than that on model-related investment.

Let your data show you the way and allow a gradual shift from a model-centric to a data-centric mindset to help you rethink how ML projects are formulated and implemented.

Do you need a partner in navigating through times of change?


At Wiser, we specialize in delivering success and will be happy to accompany you through your data science and analytics journey, all the way into the stratosphere. Learn all you need to know about data science or just book a consultation with our team of experts to start your data science journey efficiently, with the right team on your side.

How to achieve broadcast-grade quality in PPV streaming at scale?

Introduction

In the ever-evolving realm of digital streaming, delivering pay-per-view (PPV) content that matches the caliber of traditional broadcast has remained a formidable challenge. Wiser Technology, a leading software service company, stands at the forefront of this challenge, striving to provide millions of global fans with a seamless, high-quality live-streaming experience. This case study aims to succinctly convey the technical prowess and successful strategies employed by Wiser Technology, in partnership with JW Player, to deliver exceptional PPV live-streaming services.

Together, we will dive into how Wiser Technology, in collaboration with JW Player (JWP), successfully achieves broadcast-grade quality in PPV streaming at scale.

The Challenge

The PPV streaming landscape is complex, particularly when it comes to providing a broadcast-grade consumer experience at scale without delay. The goal is to deliver content with minimal latency and replicate the real-life experience of watching a live broadcast.

The Solution

Our quest for optimal technology led us to JW Platform, a robust partner supporting all our infrastructure needs. JW Platform is pivotal in delivering a top-tier web experience, integrating seamlessly with payment systems, entitlements, and more.

JW Player’s Role

JW Player plays an instrumental role in this ecosystem, offering:

  • World’s Fastest HTML5 Player: Ensuring broad reach, enhanced engagement, and effective monetization.
  • Customizable Experience: Tailoring the video player to create fully branded experiences.
  • Broadcast-Quality Stability: Ensuring consistent, high-quality HLS- and DASH-compliant video playback.
  • Complete API Control: Offering comprehensive control over every aspect of the video experience.

Secure Video Streaming

In the digital age, content security is not just a feature – it’s a necessity, especially for pay-per-view events where exclusivity and revenue protection are paramount. Wiser Technology takes this aspect seriously, employing state-of-the-art Digital Rights Management (DRM) to safeguard video streams.

Wiser Technology’s streaming solution supports a variety of DRM schemas, ensuring compatibility and security across different devices and platforms. These include:

FairPlay

Used primarily in Apple environments, FairPlay is a DRM technology developed by Apple Inc. It’s widely used for streaming content on iOS devices, Apple TV, and Safari on macOS.

PlayReady

Developed by Microsoft, PlayReady is a versatile DRM solution that supports a wide range of business models, including rental, subscription, and electronic sell-through, and is widely used on Windows devices and some smart TVs.

Widevine

Google’s Widevine technology provides multiplatform DRM and supports a range of standards. It’s vital for streaming on Android devices and Chrome browsers and is also used on many smart TVs.
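To illustrate how the three schemes map onto devices in practice, here is a minimal sketch of a selection helper; the platform detection and the exact mapping are simplified assumptions for illustration, not our production logic:

```python
# Illustrative sketch: choose a DRM scheme based on the requesting platform.
def pick_drm(platform: str) -> str:
    """Return the DRM scheme typically supported by the given device or browser."""
    platform = platform.lower()
    if platform in {"ios", "tvos", "safari", "macos-safari"}:
        return "FairPlay"            # Apple devices and Safari
    if platform in {"windows", "edge", "xbox"}:
        return "PlayReady"           # Microsoft ecosystem and some smart TVs
    return "Widevine"                # Android, Chrome, and many smart TVs

print(pick_drm("safari"))   # -> FairPlay
```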

The Result

Our collaboration has led to the seamless streaming of some of the most significant live PPV boxing events, handling peak loads with minimal latency (<5sec), and offloading 95% of traffic to CDNs. Our infrastructure has successfully managed 1,200+ live PPV events annually, with 16K+ requests per second at peak times.

Conclusion

Our journey with JW Player illustrates that achieving broadcast-grade quality in PPV streaming at scale is not just a goal but a reality. With robust platforms, effective caching, redundancy, auto-scaling, and strategic CDN use, we ensure a smooth and flawless experience for every viewer. At Wiser Technology, we are dedicated to supporting every phase of the content distribution life cycle, collaborating closely with JWP to build and customize OTT services that meet the unique needs of our customers.