Wiser Technology Wins AI Adoption Leader Award at SEE ITS Summit 2025

Sofia, Bulgaria, October 3, 2025 – Wiser Technology (BUL: WISR) has won the AI Adoption Leader award at the SEE ITS Awards 2025, presented on October 2 as part of the Southeast Europe Innovation, Technology and Sourcing Summit at the Grand Hotel Millennium Sofia.

The award recognizes the Wiser AI Factory, the company’s proprietary platform for accelerating enterprise AI transformation. It highlights Wiser’s role in advancing AI adoption at a time when a recent MIT report found that 95% of GenAI pilots are failing. This achievement reinforces Wiser’s long-term strategy of combining engineering depth with scalable, platform-enabled growth.

The AI Factory marks Wiser’s evolution beyond traditional software services toward a consulting and platform-enabled model for enterprise AI adoption. It delivers proofs of concept running on real data in under a month and customized MVPs (minimum viable products) in production in under three months, creating a repeatable engine for scaling across markets while strengthening competitiveness and resilience. The impact is already visible: AI Factory-based solutions are cutting manual workloads by up to 90%, accelerating board-level decision-making, and building secure, internationally compliant cloud architectures.

“AI is both a major challenge and a significant opportunity for businesses. With the Wiser AI Factory, we help enterprises overcome the complexity of adoption while turning this technology into a driver of growth and competitive advantage. This model enables us to create long-term value and reinforce Wiser’s position as a resilient and fast-growing technology company.”

Kristian Smilenov, AI Factory Co-creator and SVP, Strategy & Business Development at Wiser Technology.

The recognition comes at a pivotal stage in Wiser Technology’s development, underscoring the strategic role of its AI-driven model in sustaining growth, strengthening market position, and supporting investor confidence.

The SEE ITS Awards, organized by AIBEST, bring together regional leaders in technology and business services. The AI Adoption Leader Award honors companies that have strategically integrated artificial intelligence into their operations, products, or services to deliver measurable performance gains.

Wiser Technology Powers Growth with Vertically Aligned Go-To-Market Approach

Sofia, Bulgaria, June 20, 2025 – Wiser Technology (BUL:WISR) today announced the completion of a significant strategic shift with the launch of five Business Units, marking the final stage of the consolidation and transformation process that began with its expansion through M&A in 2023.

As digital and AI transformation projects grow in scope and complexity, clients are no longer seeking one-off vendors. They want committed and capable partners who understand their business, context, and industry – partners who grasp the stakes, navigate ambiguity, and deliver lasting results. Wiser’s new Business Unit structure reflects that shift.

This change marks Wiser’s transition from a functionally structured organization to a vertically and regionally aligned model. It reflects the company’s commitment to delivering greater value by combining deep technological expertise with industry-specific knowledge and insight into clients’ markets. Each Business Unit will operate with its own strategic leadership, P&L ownership, and tailored business plan, positioning Wiser to create sustained impact across key sectors and growth regions.

Effective immediately, the company will operate through five Business Units:

  • Automotive, Aerospace & Defense, led by Dimitar Dimitrov, will deliver intelligent edge and cloud systems, along with advanced mission-critical software solutions tailored for highly regulated industries.
  • Financial Services, led by Alexander Papratilov, will support financial institutions with digital transformation, secure & compliant solutions, and AI-driven innovation.
  • Technology, Media & Telecommunications, led by Emil Galabov, will partner with IT companies, media, and telecom providers on platform transformation, service offerings, and AI adoption.
  • Middle East, led by Bojidar Bakalov, is structured as a geographic unit, reflecting the region’s unique dynamics, where large-scale AI transformation programs often span multiple industries.
  • Emerging Verticals, led by Jovan Kocić, focuses on additional industry verticals such as retail and healthcare, where Wiser already has experience and sees strong potential for growth.

Wiser’s new go-to-market model includes a cross-industry Strategic Partnerships Organization, led by Spartak Kabakchiev, and a cross-industry Marketing Organization, led by Slavena Tisheva.

“This new approach allows us to deliver greater value to our clients by aligning more closely with their industry-specific needs and business context,” said Kosta Jordanov, CEO of Wiser Technology. “It enables us to take a proactive, custom approach to each market segment we serve, providing tailored solutions that drive measurable impact. Our top entrepreneurial business talent is now focused on driving growth.”

It’s not just about creating focus; it’s about embedding leadership into the mission. Each unit is empowered to shape its own go-to-market strategy, evolve with its customers, and take ownership of long-term value creation. This structural pivot enables speed, accountability, and trust at the edge of the business, where it matters most.

Internally, the new model moves Wiser from a generic go-to-market approach to a vertically focused one, combining deep industry expertise with horizontal capabilities in strategy, design, engineering, and enablement. Each Business Unit is powered by Wiser’s full-stack delivery model, which brings together strategic advisory, experience design, engineering, and enablement services. This integrated approach allows the company to support clients across the full lifecycle – from initial problem definition to long-term operational success.

With this structure in place, Wiser is positioned to scale with purpose, bringing vertical depth, delivery excellence, and long-term accountability to every client relationship. This marks the start of a new chapter defined by industry and regional focus, and a relentless commitment to delivering meaningful outcomes for customers and generating value for shareholders.

Wiser Technology Wins Two New European Defence Fund Projects

Sofia, Bulgaria, May 8, 2025 – Wiser Technology (BUL:WISR) is a partner in two winning European Defence Fund (EDF) consortia selected under the 2024 call. Both projects are 100% financed by the EDF and have a combined budget exceeding €33 million. ASTERION focuses on delivering a universal system enabling secure and adaptable underwater communication between and within devices. MARTINA is dedicated to creating a framework to test the accuracy and reliability of AI for satellite image analysis.

ASTERION (Adaptive and Secure Technology-Enabling Reliable and Integrated Opto-acoustic underwater Networking) is a 36‑month action coordinated by the Netherlands Organisation for Applied Scientific Research (TNO), bringing together 19 organizations – nearly half of them research institutes. The project aims to design a flexible and secure underwater communication system that seamlessly connects devices and networks, enhancing interoperability in maritime defense operations.

Wiser contributes specialized algorithms and protocol expertise in ultrasonics for underwater communication. The team previously delivered system-of-systems requirements and detailed software and hardware design for the CUIIS underwater-technologies project financed under the European Defence Industrial Development Programme (EDIDP). According to Toshko Punchev, Head of Defense Solutions at Wiser Technology, the company was recommended by another consortium partner – one of the world’s largest shipbuilders – because of its valuable combination of deep military-standards know-how and hands-on applied expertise demonstrated in the CUIIS project.

The ASTERION consortium is tasked with developing a versatile underwater communication system that uses acoustic, optical, electromagnetic, and above-water channels (radio and internet) to connect devices both below and above the water, supporting various data speeds and formats. “It sounds straightforward on paper, yet the variability of seawater causes refraction and interference that must be detected, filtered, and mitigated,” Punchev explained. “ASTERION will be a demanding project, but our team has the discipline and experience to deliver it.”

Running in parallel, the 48‑month MARTINA (Multi-source satellite imagery ARTificial INtelligence Analysis challenge) action aims to create a standardized framework to evaluate how effectively AI analyzes diverse satellite images for defense purposes, ensuring accuracy and reliability. The project is coordinated by the Royal Netherlands Aerospace Centre (NLR) and includes partners like Fraunhofer, DLR, ICEYE, SatCen, Hisdesat, and the Luxembourg Institute of Science and Technology. 

Wiser Technology will be responsible for developing the consortium’s software infrastructure, including a user Identity and Access Management portal for participant interaction and evaluation, a robust front end for intuitive access, and a scalable back end to support data handling, workflow management, and result submission. In addition, the company will contribute its expertise in analyzing road infrastructure through dataset preparation, evaluation frameworks, and AI lifecycle development, building on its prior European Space Agency (ESA) work in AI-driven image analytics for critical infrastructure analysis and road safety.

“The MARTINA project is an excellent example of how Wiser Technology creates synergy from its ESA experience and commercial expertise in building state-of-the-art solutions for European defense needs.”

Ventsislav Neykov, VP Technology & Innovation at Wiser Technology

These new assignments continue Wiser’s growing track record in European defense innovation, following over a dozen EDIDP, EDF, and NATO projects. Across all of them, Wiser applies established military standards for software development throughout its full lifecycle—processes, milestones, and documentation—and maintains specialized licenses for requirements traceability and design modelling. This disciplined toolchain delivers the unified engineering approach that European ministries of defense increasingly require.

About the European Defence Fund

EDF Official Website

The European Defence Fund (EDF) is a flagship initiative by the European Union aimed at enhancing defense collaboration among Member States. With a budget of nearly €8 billion for the 2021–2027 period, the EDF supports joint research and development projects that foster innovation and improve interoperability in defense technologies and equipment. By providing financial assistance throughout the entire lifecycle of defense products—from research and design to prototyping, testing, and certification—the fund seeks to strengthen the EU’s strategic autonomy and bolster the competitiveness of its defense industry.

Private Social Network for Plight

Client at a glance

  • 50k+ active users
  • 5 years in operation (established in 2020)
  • 2+ million posts monthly

Building a scalable platform

Plight aimed to develop a social platform that would allow users to engage in meaningful discussions, explore diverse perspectives, and receive objective insights on trending and important topics. The platform had to be feature-rich, scalable, and deliverable within tight deadlines, which required integrating various external services for content moderation, KYC, and media distribution.

Challenge

  • Tight Deadlines: The need to rapidly develop the core features of the platform while deferring scalability work to a later stage.
  • External Dependencies: The platform required integration with external services for content moderation, KYC (Know Your Customer), and media distribution to meet the deadlines.
  • Scalability: Ensuring the platform could grow effectively without compromising performance or user experience.

Our Approach

We created a system using a microservices architecture, allowing for modularity and scalability.

Key actions included:

  • Microservices Architecture: Implemented a modular design to ensure scalability and maintainability as the platform grew.
  • Complex Payment System: Developed a tiered model for payments in credits and fiat currency, with Stripe integrated for payment handling and an internal ledger to track transactions (a minimal sketch of this pattern follows the list).
  • Content Distribution: Integrated a Content Delivery Network (CDN) for efficient media content distribution across the platform.
  • AI-Based Moderation: Leveraged AI services for automatic content moderation, including tagging of text, images, and videos to maintain quality and safety.
  • Security and Verification: Deployed antibotting services for content protection, along with a KYC service for verifying content creators through document and selfie authentication.
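
To make the payment pattern above more concrete, the sketch below pairs a Stripe charge with an internal ledger record. It is a minimal illustration, not the platform’s actual code: the `LedgerEntry` structure, the per-credit price, and the in-memory ledger list are assumptions introduced for the example.

```python
# Minimal sketch: charge a user via Stripe and mirror the transaction in an
# internal ledger. Names such as `ledger` and `credit_price_cents` are
# illustrative, not part of the actual platform.
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

import stripe  # pip install stripe

stripe.api_key = "sk_test_..."  # placeholder test key


@dataclass
class LedgerEntry:
    entry_id: str
    user_id: str
    credits: int
    amount_cents: int
    currency: str
    stripe_ref: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


ledger: list[LedgerEntry] = []  # a real system would persist this in a database


def purchase_credits(user_id: str, credits: int, credit_price_cents: int = 50) -> LedgerEntry:
    """Create a Stripe PaymentIntent for a credit pack and record it internally."""
    amount = credits * credit_price_cents
    intent = stripe.PaymentIntent.create(
        amount=amount,
        currency="usd",
        metadata={"user_id": user_id, "credits": str(credits)},
    )
    entry = LedgerEntry(
        entry_id=str(uuid.uuid4()),
        user_id=user_id,
        credits=credits,
        amount_cents=amount,
        currency="usd",
        stripe_ref=intent.id,
    )
    ledger.append(entry)
    return entry
```

In production, the ledger would live in a transactional database and be reconciled against Stripe webhooks rather than appended in memory.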

Impact delivered:

  • Fully Functional Social Platform: Delivered a feature-rich platform that allows users to subscribe to content and engage in thoughtful discussions.
  • Comprehensive Administration: Enabled administration of all platform content and business logic from a separate portal for ease of management.
  • AI-Based Content Augmentation Plugin: Developed a plugin capable of representing relationships between political entities, enhancing the platform’s value.
  • Integrated Services: Successfully integrated CDN, voice and chat services, KYC, and antibotting features to increase platform value and user safety.

NLP for Superior User Interaction

Client at a glance

  • $130M+ raised in venture capital
  • 3,000+ customers globally
  • $10B+ in managed outcomes

Advancing User Experience with AI-Powered Insights

Quantive (formerly Gtmhub) is a leading provider of strategy execution software and services, built on the Objectives and Key Results (OKRs) methodology. Their platform helps organizations achieve alignment, enhance visibility, and foster a results-driven culture. Quantive’s platform now offers an enhanced user experience powered by real-time semantic suggestions and predictive analytics. By combining cutting-edge NLP technologies with scalable infrastructure, we empowered Quantive to deliver smarter, faster, and more intuitive interactions that drive customer success.

Challenge

As digitalization drives innovation at unprecedented speeds, Quantive faced the challenge of optimizing its product to offer personalized and seamless user experiences. Specifically, they sought to:

  • Deliver customized recommendations for OKR creation and management, alleviating the complexity of self-service workflows.
  • Streamline navigation to Insights, ensuring users could quickly access the most relevant information.
  • Boost customer satisfaction by automating form fill-ins and improving usability.

Our Approach

To address these needs, we collaborated with Quantive to deliver a data-driven solution that leverages advanced machine learning and natural language processing (NLP) technologies.

Key actions included:

  1. Data Infrastructure:
    • Developed a PostgreSQL database connected to an automated pipeline in Azure Cloud.
    • Integrated Azure Data Lake and Data Factory for sourcing data from MongoDB, later orchestrated through Azure Synapse for scalability.
  2. NLP Integration:
    • Built an LSTM-based neural network using TensorFlow for automatic form fill-ins (a minimal sketch follows this list).
    • Deployed a pre-trained Transformer model to generate real-time semantic recommendations for Insights, integrated via gRPC microservices fetching data from a Kafka stream.
  3. Operationalization of ML Pipelines:
    • Automated data science workflows using MLflow, Jenkins, and SonarQube for model training, testing, and deployment.
    • Set up Grafana dashboards to monitor real-time service accuracy and performance, ensuring ongoing reliability.
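
To illustrate the LSTM-based form fill-in from item 2, here is a hedged sketch of a token-level tagger in TensorFlow/Keras, the libraries listed in the technology stack. The vocabulary size, sequence length, and label set are placeholder assumptions rather than Quantive’s actual configuration.

```python
# Hedged sketch of an LSTM-based tagger for automatic form fill-in:
# each token of a free-text input is classified into a form field
# (e.g. OBJECTIVE, KEY_RESULT, NONE), so fields can be populated directly.
import tensorflow as tf

VOCAB_SIZE = 20_000   # assumed vocabulary size
MAX_LEN = 64          # assumed maximum token-sequence length
NUM_LABELS = 5        # assumed number of form-field labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(MAX_LEN,)),
    tf.keras.layers.Embedding(VOCAB_SIZE, 128, mask_zero=True),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    # One label prediction per token.
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(NUM_LABELS, activation="softmax")),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
# Training would use padded token-id sequences and per-token integer labels:
# model.fit(x_train, y_train, validation_split=0.1, epochs=10, batch_size=32)
```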

Impact Delivered

  • Precision in Automation: Achieved over 95% accuracy in automatic entity recognition and form completion.
  • Enhanced Engagement: Significantly boosted customer engagement by providing timely, relevant recommendations.
  • Increased Usability: Improved adoption of OKRs and Insights features by streamlining workflows and enhancing user satisfaction.
  • Performance Excellence: Delivered a model service with a mean response time of 150 milliseconds per request, seamlessly handling thousands of customer interactions daily.

Expertise and Scope

  • Deliverables: NLP-based recommendation system, automated data pipelines, monitoring dashboards
  • Technology Stack: Python, TensorFlow, Keras, MLflow, Docker, Jenkins, SonarQube, Azure Data Lake, Azure Data Factory, Azure Synapse, PostgreSQL

Enterprise Transformation with Advanced Software Solutions for BOSCH

Client at a Glance

  • $91B annual revenue
  • 400,000 employees worldwide
  • 60 countries of operation

Pioneering Digital Transformation for a Global Leader

By leveraging Wiser’s expertise, Bosch accelerated its digital transformation, fostering innovation across its key business units. This collaboration empowered Bosch to stay ahead of the competition, harness emerging technologies, and maintain its position as a global industrial leader.

Challenge

As a global leader in industrial engineering, Bosch faced the dual challenge of transforming into a digital-first organization to meet evolving consumer demands while staying competitive against industrial giants from the US and Asia. At the same time, Bosch sought to leverage its vast data assets to explore emerging AI/ML technologies. To maintain its leadership position, Bosch needed external expertise to complement its in-house engineering teams and accelerate innovation across multiple domains.

Our Approach

Since 2021, Wiser has partnered with Bosch Digital (formerly Bosch.IO), the IT subsidiary of the Bosch Group, to support its transformation journey. Collaborating closely with Bosch’s engineering teams, Wiser contributed talent and expertise across a range of cutting-edge projects, including:

  • Manufacturing at Scale: Developed digital tools using Java and Angular to optimize industrial manufacturing processes.
  • AI/ML Research: Conducted research and Proof-of-Concept (PoC) projects involving knowledge graphs to enhance data insights and decision-making.
  • Mobile Development: Built Android applications to support Bosch’s digital ecosystem.
  • Anti-Counterfeit Solutions: Delivered digital solutions to secure physical goods against counterfeit risks.
  • Cloud Application Development: Implemented an open-source cloud stack for scalable application development.

Our teams worked in a hybrid collaboration model, combining remote and onsite work at Bosch and Wiser offices. Depending on project needs, Wiser’s specialists traveled to Stuttgart or other Bosch locations for workshops, technical conferences, and strategy sessions.

Impact Delivered

  • Enhanced Manufacturing Efficiency: Improved manufacturing at scale with digital tools, driving operational excellence.
  • Innovation with AI/ML: Delivered PoC projects using advanced AI/ML techniques, positioning Bosch for future technological leadership.
  • Optimized Customer Experience: Digitized customer-facing applications, enriching user engagement and satisfaction.
  • Scalable Development Solutions: Provided Bosch with robust cloud-based tools to enable seamless application development.

Expertise and Scope

  • Tech Stack: Java, Python, PHP, DevOps tools, Angular
  • Focus Areas: Manufacturing optimization, AI/ML research, Android app development, digital anti-counterfeit solutions

IoT-Connected App for Dam Monitoring

Client at a glance

  • #1 in green recovery and tech development in Bulgaria
  • Pioneer in IoT solutions in the region since 2012

Revolutionizing Infrastructure Monitoring with IoT Solutions

By delivering a robust and scalable IoT platform, Wiser empowered Sentra Systems to transform dam monitoring for their enterprise customer. The solution not only enhanced real-time situational awareness but also ensured infrastructure safety, supporting proactive decision-making and contributing to sustainable resource management.

Challenge

Sentra Systems, a leader in IoT solutions, aimed to develop a state-of-the-art monitoring system to replace a legacy platform used for dam and reservoir oversight. Their enterprise customer required a scalable, user-friendly solution capable of 24/7 on-site monitoring and seamless data analysis to address critical risks like flooding and drought. Additionally, the new system had to support future use cases, such as municipal water supply and smart city applications.

Our Approach

Partnering with Sentra Systems, we developed a robust IoT-connected application tailored to their customer’s needs.

Key actions included:

  1. Discovery Phase:
    • Defined the project scope and business needs, ensuring a clear understanding of challenges and unique requirements.
    • Created an all-in-one repository of requirements and success criteria to guide development.
    • Selected the optimal technologies and designed the system architecture for scalability and long-term viability.
  2. Feature Development:
    • Introduced a modern user interface with live data feeds, photo capturing capabilities, and 3D visualization for over 100 sites with 2,000+ sensors.
    • Developed customizable dashboards for sensor data, offering users flexibility in monitoring and reporting.
    • Designed live alert systems for extreme events, such as high water levels and risks of overflow or drought (a minimal sketch of the alert logic follows this list).
  3. Collaborative Execution:
    • Worked closely with Sentra’s engineering team, incorporating feedback through mini-demos after each development phase.
    • Connected the front-end and back-end in December 2020, enabling seamless data flow and real-time monitoring.
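
As a rough illustration of the live-alert feature referenced in item 2, the sketch below subscribes to sensor readings over MQTT (part of the listed technology stack) and flags water levels above a threshold. The topic layout, JSON payload, broker hostname, and threshold value are assumptions made for the example, not the production configuration.

```python
# Hedged sketch: subscribe to dam sensor readings over MQTT and raise an
# alert when a water level crosses a threshold.
import json

import paho.mqtt.subscribe as subscribe  # pip install paho-mqtt

WATER_LEVEL_TOPIC = "dams/+/sensors/water_level"  # assumed topic layout
ALERT_THRESHOLD_M = 12.5                          # assumed critical level in metres


def on_message(client, userdata, message):
    """Parse one sensor reading and print an alert if it exceeds the threshold."""
    reading = json.loads(message.payload)  # e.g. {"site": "dam-017", "level_m": 13.1}
    level = float(reading["level_m"])
    if level >= ALERT_THRESHOLD_M:
        print(f"ALERT: {reading.get('site', message.topic)} water level {level} m "
              f"exceeds {ALERT_THRESHOLD_M} m")


# Blocks and invokes on_message for every reading published to the topic.
subscribe.callback(on_message, WATER_LEVEL_TOPIC, hostname="broker.example.com")
```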

Impact Delivered

  • Scalable Monitoring System: Monitors 100+ sites with over 2,000 sensors, analyzing 28,000+ data points per minute.
  • Enhanced User Experience: Delivered a mobile-friendly, user-centric interface tailored for infrastructure technicians on the go.
  • Improved Risk Management: Enabled live alerts for extreme events, helping authorities anticipate and respond to floods, droughts, and other risks.
  • Future-Ready Solution: Designed with potential applications for parking lot monitoring and municipal water supply management.

Expertise and Scope

  • Deliverables: IoT-connected app with real-time monitoring, customizable dashboards, and 3D visualization
  • Technology Stack: Vue.js, Python (Django), HTML (Jinja), jQuery, CSS3 (Bootstrap, Bootstrap Vue), Highcharts.js, Docker, Jenkins, Git (GitLab, Git Flow), Linux, MQTT, MySQL, Mosquitto, WebSockets

The Great Divide: Model-centric vs. Data-centric approach

Data and models are the bread and butter of machine learning (ML). Because academic research and competitions in data science focus mostly on improving ML models and algorithms, the data often remains overlooked. This creates an artificial division between the data and the model in an ML system, framing two separate approaches to AI: model-centric and data-centric.

The benefits of excellent models


A famous quote often attributed to the statistician George Box says that all models are wrong but some are useful. By extension, some models are extremely useful, and some are, let’s face it, useless. To build a good ML solution, you need a model that captures the underlying dependencies in the data, filtering out the idiosyncratic noise and performing well on new, unseen data.

A model improvement can be achieved in various ways. While there are many common recipes and tools for model optimization, for many applications the modelling work remains as much an art as a science. The usual workflow includes:

  • Testing various model architectures and specifications, different objective functions and optimization techniques;
  • Fine-tuning the hyper-parameters defining the model structure and the model-training process.

The model-centric approach dedicates time and resources to iterating on the model, with the goal of improving the accuracy of the ML solution while keeping the training data set fixed.
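
A minimal sketch of this model-centric loop, assuming a scikit-learn classifier: the training data stays fixed while the model’s hyper-parameters are searched. The model family and parameter grid are illustrative.

```python
# Model-centric iteration: fixed dataset, searched hyper-parameters.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)  # fixed dataset

param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [2, 3, 4],
    "learning_rate": [0.03, 0.1],
}
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    cv=5,
    scoring="roc_auc",
    n_jobs=-1,
)
search.fit(X, y)
print("Best params:", search.best_params_)
print("Best CV AUC:", round(search.best_score_, 4))
```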

The closer one gets to the realistic limits of model performance, the smaller the room for model improvements becomes and the more the marginal return on the time and resources spent diminishes. None of this means the potential of the whole ML solution has been reached; there may still be vast room for improvement.

The benefits of high-quality data


Once you see that you have reached the potential of your model on the given dataset, the usual go-to is the universal advice to “get more training data.” This might often be all you need to reach the performance goals of your model. Sometimes, though, what you need is not more data, but better data.

The data-centric approach is concerned with improving the overall performance of the ML solution by focusing on the quality and sufficiency of the data while keeping the model-training part fixed. What it suggests is not novel or revolutionary but a reminder: no model can be better than the data it was trained on, and improvements in data quality can lead to much higher performance gains for the overall ML solution.

Data consistency, data coverage, label consistency, feedback timeliness and thoroughness, and model metadata are some of the aspects of the data that can improve your ML solution.

  • Consistent data is data; anything else is confusion and ambiguity. Are your ETL (extract, transform and load) pipelines providing the clean and systematic data necessary for your ML applications? If not, a greater effort on those processes is probably warranted.
  • Data coverage asks whether the sample you are training your model on is representative of the population the model will be used on. If some subpopulations or classes are underrepresented, evaluate the likely effect and, if needed, think about how to overcome it; data filtering, rebalancing, or augmentation often help. Another aspect of coverage is content: are all characteristics relevant for discriminating between observations present in your dataset, and do you need (and can you get) additional features for your ML task?
  • Label consistency – this is a huge issue for any supervised ML task. From the correct definition of the labels to the accurate labelling of the dataset, every aspect can hugely affect the outcome of model training. There are multiple strategies and techniques for improving the labels in your project, and it is always a good idea to spend some time checking label quality manually, even on a very small subset of the data (a minimal consistency check is sketched after this list).
  • Monitoring data – once deployed to production, the ML system is not done. Model performance will inevitably deteriorate due to data or concept drift. Setting up good monitoring for your model is the first line of defence against such a trend. Often one cannot foresee in which aspect the input data may shift or how model performance may decrease, so monitoring a wider range of indicators and subpopulations may reveal underlying changes faster.
  • Model metadata – the quality of an ML system also depends on transparency and reproducibility. Model performance metrics and the means for reproducing a model can collectively be called model metadata; they are also important for easing the work on model experimentation and optimization.
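
As a small illustration of the label-consistency point, the sketch below flags identical inputs that received conflicting labels, a frequent source of noise in supervised datasets. The column names and toy data are assumptions for the example.

```python
# Basic label-consistency check: find duplicated inputs with conflicting labels.
import pandas as pd

df = pd.DataFrame({
    "text":  ["reset password", "reset password", "invoice overdue", "invoice overdue"],
    "label": ["account",        "security",       "billing",         "billing"],
})

# Group identical inputs and count how many distinct labels each one received.
label_counts = df.groupby("text")["label"].nunique()
conflicting = label_counts[label_counts > 1]

print(f"{len(conflicting)} of {label_counts.size} unique inputs have conflicting labels")
print(df[df["text"].isin(conflicting.index)].sort_values("text"))
```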

Business and analytic tradeoffs


How to strike the right balance between improving your code and improving the quality of your data? You can – as with any other decision – put some data into use.

Analyze your processes and see how the time spent working on data compares with the time spent working on code to improve the accuracy of your ML applications. Time-box the model optimization part, put the model in production when you reach satisfactory results, and start collecting feedback to gain insight into your model and improve your data set. Have the MLOps team prioritize high-quality data throughout all phases of the ML project.

It might be worth reconsidering also the composition of your ML teams. How many data engineers and analysts vs ML engineers and modellers do you have?

This can be generalized further at an organizational level for any decision concerning your data assets and ML projects. Build and maintain better data infrastructure instead of simply investing in more ML projects, and consider how better data quality and infrastructure can improve the profitability of the projects you do undertake.

Where to go from here?


Starting from the investigation phase of the project, spend some time estimating the upper feasible limit on the performance of the model you are going to build. If it is a frequently occurring ML task, check the literature for the level already achieved by other data scientists. Alternatively, take a small sample and measure human-level performance on it. This can serve as a guideline for the feasible model performance on the task at hand.

Once realistic benchmarks for the output of your ML project are set up front and the first model prototype is ready, carefully analyze what is missing to reach that benchmark. A quick analysis of your model’s errors, evaluated against human-level performance benchmarks, and some digging into the potential gaps can tell you whether it is worth continuing to train and optimize the model or better to spend more time on collecting additional data, improving labels, or creating features. Iterate.
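
A minimal sketch of such an error analysis, assuming a pandas evaluation table, a data-slice column, and an illustrative human-level benchmark:

```python
# Compare the model's error rate against an assumed human benchmark,
# overall and per data slice, to see where better data may be the bigger lever.
import pandas as pd

HUMAN_ERROR_RATE = 0.04  # assumed human-level benchmark for this task

eval_df = pd.DataFrame({
    "slice":  ["mobile", "mobile", "web", "web", "web", "email"],
    "y_true": [1, 0, 1, 1, 0, 1],
    "y_pred": [1, 0, 0, 1, 0, 0],
})
eval_df["error"] = (eval_df["y_true"] != eval_df["y_pred"]).astype(int)

overall = eval_df["error"].mean()
by_slice = eval_df.groupby("slice")["error"].mean().sort_values(ascending=False)

print(f"Model error {overall:.2%} vs human benchmark {HUMAN_ERROR_RATE:.2%}")
print("Error rate by slice (largest gaps suggest where better data may help):")
print(by_slice)
```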

What helps in moving through these phases effectively is a data-centric infrastructure for the ML solution: an automated retraining and deployment process, plus integrated model monitoring that quickly feeds back model performance and new training-data increments to trigger retraining or rework. For this purpose, the project requires an MLOps infrastructure that provides timely, consistent, high-quality data for your system. Tools and expertise for building full MLOps pipelines are accumulating quickly to meet the new requirements and demand in production ML.
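
One hedged way to wire monitoring to retraining is a drift score such as the Population Stability Index (PSI); the sketch below flags retraining when a feature’s live distribution drifts away from the training one. The 0.2 threshold is a common rule of thumb, not a universal constant.

```python
# Drift check: compare a feature's training distribution with its live one
# and flag retraining when the PSI exceeds a threshold.
import numpy as np


def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between two samples of one feature."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    exp_pct = np.clip(exp_pct, 1e-6, None)  # avoid log(0) and division by zero
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))


rng = np.random.default_rng(0)
training_sample = rng.normal(0.0, 1.0, 10_000)  # distribution seen at training time
live_sample = rng.normal(0.4, 1.2, 2_000)       # shifted production distribution

score = psi(training_sample, live_sample)
if score > 0.2:
    print(f"PSI={score:.3f}: significant drift, trigger the retraining pipeline")
else:
    print(f"PSI={score:.3f}: distribution stable")
```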

Prioritize data quality over data quantity. Prioritizing the creation and maintenance of systematic, high-quality data for your business unlocks the potential for better analytics and better ML solutions across your organization. Instead of investing in models for each of the use cases you want to address, put your data at the centre of your decision-making and build the data infrastructure that allows you to create cutting-edge ML solutions, reach the quality necessary to make the ML investment profitable, and protect your solutions from costly or hard-to-fix deteriorations in performance.

And know that you are not alone in this. Andrew Ng is on a quest for higher data awareness, and more and more useful content on the topic can be found on the Data-Centric AI Resource Hub.

The data should show the way


The data-centric approach isn’t anything new. Applied data scientists and ML practitioners have always known that the data is the guiding light, the main ingredient of their recipes. What the data-centric approach emphasizes is that, in many applications, the marginal return on data-quality work may be higher than that of further model-related investment.

Let your data show you the way and allow a gradual shift from a model-centric to a data-centric mindset to help you rethink how ML projects are formulated and implemented.

Do you need a partner in navigating through times of change?


At Wiser, we specialize in delivering success and will be happy to accompany you through your data science and analytics journey, all the way into the stratosphere. Learn all you need to know about data science or just book a consultation with our team of experts to start your data science journey efficiently, with the right team on your side.

Maximizing Productivity and Profitability for MachineMax

Client at a glance

  • 90%+ accuracy in predicting machine needs
  • 100,000+ machine hours tracked annually

Transforming Heavy Machinery Management with IoT and Predictive Analytics

MachineMax’s platform now sets a new standard for equipment management, maximizing machine utilization and profitability while reducing environmental impact. With a modernized codebase, enhanced user experience, and cutting-edge data visualization, MachineMax is positioned as a leader in leveraging IoT and ML to empower the construction and mining industries.

Challenge

MachineMax, a global leader in equipment management, aimed to enhance its MVP platform to meet evolving customer and investor expectations. The challenges included:

  • Codebase Modernization: Incrementally refactor legacy code while releasing new features to align with modern standards without disrupting functionality.
  • Feature Expansion: Extend the product’s capabilities with predictive analytics powered by Machine Learning (ML) to offer actionable insights.
  • User Experience (UX) Enhancements: Redesign the app’s user interface to improve engagement and accessibility.
  • Data Visualization: Represent predictive analytics and operational data clearly and intuitively on the front end.
  • User Engagement: Deliver weekly analytics email reports with detailed insights and key performance indicators.

Our Approach

We partnered with MachineMax to address their needs by modernizing their platform, enhancing UX, and leveraging advanced analytics for a seamless and engaging user experience.

Key actions included:

  1. Codebase Refactoring:
    • Stored new and refactored code in a dedicated Portal folder, isolating it from legacy code to ensure stability.
    • Introduced a new Design Component system based on Material UI, customized for multi-theme support.
  2. User Experience Redesign:
    • Developed multistep forms for intuitive navigation and grouped data presentation.
    • Centralized user actions under a main navigation bar, improving accessibility for tasks like filtering, searching, and downloading reports.
    • Enhanced dashboards with sticky headers, advanced scrolling options, and predefined filters for data aggregation.
  3. Advanced Data Visualization:
    • Used D3.js to deliver rich, detailed visualizations for ML predictions, enabling users to explore all data segments effectively.
    • Implemented general statistics using Chart.js, providing popular graph-based representations.
  4. Custom Email Templates:
    • Created visually rich, dynamic email templates with integrated data analytics graphs (a minimal sketch follows this list).
    • Used tools like MJML, Handlebars, and Quickcharts.io to ensure compatibility with popular email clients, including Outlook.
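
As a rough illustration of the weekly report idea in item 4 (the production pipeline uses MJML and Handlebars, as noted above), the Python sketch below assembles a simple HTML email with a few KPIs and an embedded chart image. The template, KPI names, customer name, and chart URL are placeholders, not MachineMax’s actual report format.

```python
# Hedged sketch: render a weekly analytics email with KPIs and a chart image.
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from string import Template

HTML_TEMPLATE = Template("""
<html><body>
  <h2>Weekly fleet report for $customer</h2>
  <p>Utilization: <b>$utilization</b> &middot; Idle time: <b>$idle_time</b></p>
  <img src="$chart_url" alt="Machine hours per day" width="600">
</body></html>
""")


def build_weekly_report(customer: str, utilization: str, idle_time: str,
                        chart_url: str) -> MIMEMultipart:
    """Return a multipart email with plain-text and HTML alternatives."""
    msg = MIMEMultipart("alternative")
    msg["Subject"] = f"Weekly analytics report - {customer}"
    msg.attach(MIMEText(f"Utilization {utilization}, idle time {idle_time}", "plain"))
    msg.attach(MIMEText(HTML_TEMPLATE.substitute(
        customer=customer, utilization=utilization,
        idle_time=idle_time, chart_url=chart_url), "html"))
    return msg


# Placeholder values; a real pipeline would compute KPIs and render the chart first.
report = build_weekly_report("Acme Mining", "78%", "11 h", "https://example.com/chart.png")
print(report["Subject"])
```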

Impact Delivered

  • Enhanced User Engagement: Delivered a redesigned UI with advanced tools for improved user satisfaction and efficiency.
  • Customizable Reporting: Provided users with detailed weekly analytics, fostering actionable insights and better decision-making.
  • Improved Scalability: Enabled smooth integration of new features without disrupting legacy functionality.
  • Optimized Data Visualization: Empowered users to interpret complex data easily through intuitive visualizations.
  • Reliable Communication: Created robust email templates, ensuring seamless delivery and compatibility across platforms.

Expertise and Scope

  • Deliverables: Custom UI component system, predictive analytics visualization, dynamic email templates, and modernized codebase.
  • Technology Stack: React.js, Redux, Go, Python, Google Cloud Platform, SQL, D3.js, Chart.js, MJML, Handlebars