Turn your software initiatives into reality with cutting-edge technology and unmatched expertise
25+ talented tech experts
With experience across various industries and technologies, working fully remotely
10+ Years of experience
In providing innovative enterprise solutions tailored to your business needs
200+ Happy clients
Including SMEs and large-scale corporations from the financial, green energy, e-commerce, retail, automotive, and telecom industries
BeSol Grupp OÜ is a software development company established in Tallinn, Estonia, in 2021 by Oleg Besman. With 10+ years of experience across various industry sectors, we have a clear understanding of how to manage complex technical processes and can deliver efficient software solutions. Our team of 25+ highly qualified programmers operates as a fully distributed remote workforce, ensuring seamless collaboration and exceptional results.
We specialize in providing tailored solutions for businesses, ranging from cloud computing and cybersecurity to data analytics and software development. Let us empower your business with our expertise.
Our services empower the leaders of the renewable energy industry with cutting-edge software tools and data products, enabling flawless development, assessment, and operation of renewable energy projects.
Our tech experts help automotive businesses worldwide optimize operations, enhance connectivity, and enable data-driven decision-making. We specialize in cybersecurity, IoT integration, cloud solutions, and advanced analytics.
We provide tailored software solutions, including accounting systems, payment gateways, budgeting tools, and secure data management. Boost productivity and operational efficiency with our full-cycle IT consulting and software development services.
Custom solutions to enhance efficiency, security, and customer experiences. From point-of-sale systems to inventory management and online platforms, we help streamline processes and drive business growth in the digital era.
We help companies optimize operations, improve supply chain management, track shipments in real time, plan routes more effectively, automate processes, ensure data security, and enable efficient communication.
Comprehensive IT services tailored to the biggest players in the telecom industry. We ensure seamless operation with expertise in network management, software solutions, security systems, cloud computing, and data analytics.
Transform your healthcare business with cutting-edge IT solutions. From streamlined electronic health record management to advanced telemedicine solutions, we offer heightened data security and optimized workflows for a transformative healthcare experience.
We also specialize in software development for manufacturing, offering tailored solutions for equipment in the processing industry. With our services, you can optimize processes, increase productivity, streamline operations, improve quality control, and gain a competitive edge.
We integrate the power of diverse programming languages to build robust and scalable software solutions.
Explore our showcases and discover how we can help your business thrive.
At BeSol Grupp OÜ, we recognize that selecting a service provider can be a challenging task, with many choices available. That's why we strive to stand out by offering exceptional service and results that speak for themselves.
We are growing a team of 25+ highly skilled, dedicated engineers. Working remotely allows us to maintain flexibility and efficiency, ensuring that our clients receive the best possible service.
oleg.besman@besol-grupp.eu
Peterburi tee 46-227, 11415 Tallinn
+372 56 06 0079
Healthcare information was scattered across different systems, making it difficult for providers to access and share comprehensive patient records. We addressed this challenge by developing a centralized health data platform that integrates data from various sources, enabling healthcare providers to access and manage information in a streamlined manner. This improved care delivery, research outcomes, and collaboration among stakeholders.
The centralized health data platform runs on the Microsoft Azure cloud, which provides a highly scalable and secure infrastructure that meets the stringent requirements of the healthcare industry. We use a wide range of both general-purpose and healthcare-specific Azure services, such as Azure API for FHIR and the Azure Security and Compliance Blueprint for Healthcare, to ensure seamless integration, interoperability, and adherence to industry regulations and best practices for data security and privacy.
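To give a concrete flavour of what working against the FHIR service looks like, the sketch below (C#, with a placeholder endpoint and an access token obtained elsewhere, both illustrative assumptions rather than the platform's actual code) performs a standard FHIR Patient search over REST.

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class FhirSearchSketch
{
    // Placeholder endpoint; the real FHIR service URL and token acquisition
    // (via Azure AD) are specific to each deployment.
    private const string FhirEndpoint = "https://example.azurehealthcareapis.com";

    public static async Task<string> SearchPatientsAsync(string accessToken, string familyName)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // Standard FHIR search: GET {endpoint}/Patient?family={name}
        var response = await http.GetAsync($"{FhirEndpoint}/Patient?family={familyName}");
        response.EnsureSuccessStatusCode();

        // The response body is a FHIR Bundle (JSON) containing matching Patient resources.
        return await response.Content.ReadAsStringAsync();
    }
}
```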
We take an innovative, collaborative, and customizable approach to creating a centralized health data platform that prioritizes user experience. By working closely with our clients, we gain a thorough understanding of their specific needs and objectives, allowing us to tailor the platform accordingly.
We collaborate closely with our clients to understand their unique needs, challenges, and goals. Through in-depth discussions and analysis, we gather comprehensive requirements that form the foundation of the platform.
We prioritize customization, tailoring the platform to align with each client's specific requirements. We ensure scalability, allowing the platform to adapt and expand as the organization's needs evolve over time.
We focus on seamless integration with existing healthcare systems, such as EHRs, laboratory information systems, and other data sources. Our goal is to ensure smooth data exchange and interoperability, eliminating silos and facilitating comprehensive data access.
We emphasize stringent data security measures, implementing robust encryption, access controls, and data governance protocols to safeguard patient information. Compliance with relevant regulations, such as HIPAA, is a key aspect of our approach.
We leverage advanced analytics and data visualization techniques to derive meaningful insights from the centralized health data. This empowers healthcare professionals and researchers to make informed decisions, identify trends, and drive evidence-based practices.
We prioritize user experience and design intuitive interfaces that enable seamless navigation and efficient workflows. Our aim is to ensure that healthcare providers and other users can easily access and utilize the platform's features and functionalities.
We provide ongoing support, maintenance, and updates to ensure the platform's optimal performance and adaptability.
Our client uses Microsoft Azure as its infrastructure provider and consumes cloud resources and services worth millions of dollars for its projects. These resources are mainly virtual machines, and their cost has continued to grow over time.
Our team reviewed the usage of cloud resources to find ways to improve cost-efficiency. To accomplish this, we analyzed the virtual machines. Our first step was to determine the current parameters of each machine, including CPU, memory, disks, and hourly/monthly price. Then, we examined how each virtual machine was actually being utilized, looking at metrics such as CPU and memory utilization. From there, we determined the minimum parameters each machine would need to meet its workload. Finally, we calculated the potential savings for each machine by subtracting NewSizeMonthlyPrice from CurrentSizeMonthlyPrice, yielding PotentialSavings.
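As a minimal sketch of this calculation (C#, with a hypothetical record type whose field names mirror the ones mentioned above, not the client's actual data model):

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical model for illustration only.
public record VmSizingResult(
    string VmName,
    string CurrentSize,
    string RecommendedSize,
    decimal CurrentSizeMonthlyPrice,
    decimal NewSizeMonthlyPrice)
{
    // PotentialSavings = CurrentSizeMonthlyPrice - NewSizeMonthlyPrice
    public decimal PotentialSavings => CurrentSizeMonthlyPrice - NewSizeMonthlyPrice;
}

public static class SavingsReport
{
    // Total across all analyzed virtual machines.
    public static decimal TotalPotentialSavings(IEnumerable<VmSizingResult> results) =>
        results.Sum(r => r.PotentialSavings);
}
```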
As part of our analysis, we provided our client with recommendations for optimizing their virtual machines, which they are free to implement as they see fit. Another potential source of savings we identified was virtual machine disks that were not attached to any machine and were essentially unused; we flagged these as candidates for deletion. Based on our calculations, the client could save up to a million dollars over a year or two by following our recommendations. Our team accessed the client's resources using the Azure CLI and used Power BI for data import, aggregation, transformation, and export.
We are currently working on a service that will automate the savings calculation by importing client resources into a repository on a schedule. Historical resource data, such as metrics and usage, will be phased out over time. After importing the resources, the service will analyze virtual machine utilization and determine the optimal size for each VM to achieve maximum savings. These findings will be saved as "recommendations" available for export. The service will also track which recommendations were implemented and calculate the actual savings achieved. This will make the optimization process easier for our clients and help them get the most out of their resources.
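A recommendation produced by this service could be modeled roughly as follows (a hypothetical shape for illustration only; the actual service schema may differ):

```csharp
using System;

// Hypothetical shape of an exported recommendation.
public record ResizeRecommendation
{
    public string VmName { get; init; } = "";
    public string CurrentSize { get; init; } = "";
    public string RecommendedSize { get; init; } = "";
    public decimal EstimatedMonthlySavings { get; init; }

    // Filled in once the service detects that the client applied the change.
    public bool Implemented { get; init; }
    public DateTimeOffset? ImplementedAt { get; init; }
    public decimal? ActualMonthlySavings { get; init; }
}
```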
A client requested a solution migration from AWS to Azure. This involved transferring multiple microservices, the database, and storage (from S3 to Azure Blob Storage). We also needed to ensure that the migration was seamless and did not cause any downtime for dependent services. On top of that, we faced the added challenge of migrating the database to a different engine, specifically from Aurora MySQL to SQL Server.
To accomplish our objectives, our development team took a methodical approach to the migration. We divided it into multiple stages, starting with adjusting the applications to work with Azure infrastructure. Next, we used an Infrastructure as Code (IaC) approach to deploy a new staging infrastructure on Azure. All migration steps were thoroughly tested in the staging environment before we deployed the production environment, but we did not go live right away. Once all tests had passed in the production environment, we switched all dependencies from AWS to Azure. We also took additional steps to synchronize data that had been modified in the old database during the brief switch-over period.
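For the object-storage part of such a migration, the per-object copy is conceptually straightforward. The sketch below (C#, using the AWS SDK for .NET and the Azure Blob Storage client library, with placeholder bucket and container names) illustrates the idea, omitting the batching, retry, and delta-synchronization logic a production migration needs.

```csharp
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Azure.Storage.Blobs;

public static class S3ToBlobCopy
{
    // Bucket and container names are placeholders for illustration only.
    public static async Task CopyBucketAsync(
        IAmazonS3 s3, BlobContainerClient container, string bucketName)
    {
        await container.CreateIfNotExistsAsync();

        var request = new ListObjectsV2Request { BucketName = bucketName };
        ListObjectsV2Response response;
        do
        {
            response = await s3.ListObjectsV2Async(request);
            foreach (var obj in response.S3Objects)
            {
                // Stream each S3 object straight into the corresponding blob.
                using var s3Object = await s3.GetObjectAsync(bucketName, obj.Key);
                await container.GetBlobClient(obj.Key)
                    .UploadAsync(s3Object.ResponseStream, overwrite: true);
            }
            request.ContinuationToken = response.NextContinuationToken;
        } while (response.IsTruncated == true);
    }
}
```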
Our client wanted to rewrite an existing legacy application, written in VB.NET, as a cloud-based, service-oriented application built with the latest version of C#. A significant part of the existing functionality is now handled by an external solution, so the new application provides numerous interfaces to seamlessly integrate the new landscape into the existing company infrastructure.
The decision was made to use Angular as the frontend framework, .NET Core (C#) for the backend services, SQL Server for the persistence layer, and Entity Framework as the ORM. All services are hosted in Azure Kubernetes Service in a private cluster.

First, our team analyzed which functionality had to be re-implemented, taking into account that a third-party solution was now in place, and agreed on the integration points. The remaining business logic was functionally decoupled into several services, each responsible for one aspect of the business flow. The existing UI of the desktop application was taken as the basis for a mockup of the future UI, with other company applications serving as additional references; the goal was a consistent design across the different client applications.

Each component of the system (backend services plus the UI application) was developed in isolation, with its own build and release pipelines and code repositories. Code that had to be shared was externalized as a NuGet package with its own development and delivery lifecycle. Logging is centralized: all logs from all components are collected in Azure Application Insights, and we support distributed tracing following the OpenTelemetry standard. We chose Azure App Configuration for managing service configuration. All components communicate with each other either over HTTPS or through a Kafka message bus. The UI is highly responsive thanks to real-time updates delivered over server-to-client WebSocket communication.
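As one common way to implement such server-to-client push from an ASP.NET Core backend (the concrete mechanism and the names below are illustrative assumptions, not necessarily what was used here), a minimal SignalR hub looks like this:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

// Clients (e.g. the Angular app) connect to this hub over WebSockets.
public class StatusHub : Hub { }

// Any backend component can push updates to all connected clients.
public class StatusNotifier
{
    private readonly IHubContext<StatusHub> _hub;

    public StatusNotifier(IHubContext<StatusHub> hub) => _hub = hub;

    public Task NotifyAsync(string entityId, string newStatus) =>
        _hub.Clients.All.SendAsync("statusChanged", entityId, newStatus);
}

// Registration in Program.cs (illustrative):
//   builder.Services.AddSignalR();
//   app.MapHub<StatusHub>("/hubs/status");
```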
The application became much more readable and maintainable: the code was rewritten in the latest C# version, following SOLID principles and an N-tier, microservices architecture. Because part of the business logic was externalized to a third-party system, the company now has less business logic to maintain. The UI is also much more comfortable to use thanks to the reuse of the Angular component library.
Full time
95% Remote
You will work for a German electrical grid operator, developing a modern SCADA system with the following capabilities:
The system must be capable of operating in complete isolation, in terms of both network connectivity and power supply. Even if the entire region suffers a complete blackout, our system must still be able to operate the electrical grid.
During development, we use Azure Kubernetes Service, Azure DevOps, Azure Key Vault, Azure Storage Account, and other cloud tools. In production, the system runs on vanilla (self-managed) Kubernetes.
We ingest streams of sensor data via Kafka. A sophisticated Kafka Streams topology joins and transforms the streams of events.
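The actual pipeline is a Kafka Streams topology; purely as an illustration of the ingestion side, and keeping to C# like the other sketches on this page (broker address and topic names are placeholders), a consume-transform-produce loop with Confluent.Kafka might look like this:

```csharp
using System;
using System.Threading;
using Confluent.Kafka;

public static class SensorIngestSketch
{
    public static void Run(CancellationToken ct)
    {
        var consumerConfig = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",   // placeholder broker address
            GroupId = "sensor-ingest",
            AutoOffsetReset = AutoOffsetReset.Earliest
        };
        var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };

        using var consumer = new ConsumerBuilder<string, string>(consumerConfig).Build();
        using var producer = new ProducerBuilder<string, string>(producerConfig).Build();
        consumer.Subscribe("grid.sensor-readings");    // hypothetical topic name

        while (!ct.IsCancellationRequested)
        {
            var result = consumer.Consume(ct);
            // Transform the raw reading and forward it downstream;
            // the real topology also joins several such streams.
            var enriched = $"{result.Message.Value}|ingested={DateTimeOffset.UtcNow:O}";
            producer.Produce("grid.sensor-readings.enriched",
                new Message<string, string> { Key = result.Message.Key, Value = enriched });
        }
        producer.Flush(TimeSpan.FromSeconds(5));
    }
}
```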
The infrastructure is provisioned via Terraform. The application is deployed via Helm charts. The CI/CD pipeline runs on Azure DevOps. We make intensive use of various Kubernetes operators (Kafka, Keycloak, and many more).
Full time
80% Remote
“Our primary focus lies in the development of enterprise solutions for logistics, banking, healthcare, automotive, manufacturing, and other industries. We are proud to work with renowned, large-scale companies, helping them meet the growing demand for digital transformation and stay competitive and innovative in their industries.”
Oleg Besman, CEO