New level of Windows optimization
About two years ago, our team first wrote about its project to optimize Windows using neural-network technology.
Much has changed over the past two years, but the most important things have stayed the same. In this article we want to share the discoveries and conclusions we have made during that time.
If you are interested in the state of independent PC software development in 2020, welcome.
But before moving on to the main point, we should clarify that everything below is described from the perspective of an independent company, one that moves forward using only its own internal resources.
What has been done in these two years
Publishing in the MS Store turned out to be one of the most difficult, but also one of the most important, achievements of the project. It was this step that let us move beyond the CIS market and address the global market for Windows software.
The next step: once it became possible to manage CPU priorities and CPU core affinity to optimize Windows, the application had to learn to manage all the other key components of a PC so that the neural-network algorithms could be used with maximum efficiency.
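To illustrate the kind of WinAPI plumbing this involves, here is a minimal Python/ctypes sketch. The split into a "planning" policy and an "apply" step, the policy itself, and the function names are our illustrative assumptions; only the Win32 constants and calls (`OpenProcess`, `SetPriorityClass`, `SetProcessAffinityMask`) are real:

```python
import ctypes
import sys

# Documented Win32 priority-class constants (winbase.h).
ABOVE_NORMAL_PRIORITY_CLASS = 0x00008000
BELOW_NORMAL_PRIORITY_CLASS = 0x00004000

def plan_cpu_settings(is_foreground_heavy: bool, total_cores: int):
    """Decide a priority class and an affinity mask for a process.

    Illustrative policy: a foreground-heavy app (game, renderer) gets a
    boosted priority and all cores; a background task is demoted and
    confined to the lower half of the cores.
    """
    if is_foreground_heavy:
        return ABOVE_NORMAL_PRIORITY_CLASS, (1 << total_cores) - 1
    return BELOW_NORMAL_PRIORITY_CLASS, (1 << (total_cores // 2)) - 1

def apply_cpu_settings(pid: int, priority_class: int, affinity_mask: int) -> bool:
    """Apply the plan through WinAPI; a no-op off Windows."""
    if sys.platform != "win32":
        return False
    PROCESS_SET_INFORMATION = 0x0200
    k32 = ctypes.windll.kernel32
    handle = k32.OpenProcess(PROCESS_SET_INFORMATION, False, pid)
    if not handle:
        return False
    try:
        ok = k32.SetPriorityClass(handle, priority_class)
        ok = k32.SetProcessAffinityMask(handle, affinity_mask) and ok
        return bool(ok)
    finally:
        k32.CloseHandle(handle)
```

Because the apply step is guarded by a platform check, the planning policy can be exercised and tested anywhere.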
And while managing CPU cores and priorities was relatively simple, managing RAM priorities and I/O priorities required diving so deep into the Windows architecture that we could have written dissertations along the way. For example, of all the software we know, only WPS and Process Lasso can set a “high” I/O priority.
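For the I/O case in particular, the only route we know of to a "high" hint on a whole process is the lightly documented `NtSetInformationProcess` information class `ProcessIoPriority` (33); raising the hint above normal typically also requires elevated privileges. The sketch below is our own hedged illustration (function name, error handling, and return convention are assumptions), not WPS's actual code:

```python
import ctypes
import sys

# IO_PRIORITY_HINT values from the Windows kernel headers (wdm.h);
# "high" (3) is the level the article says almost no tools expose.
IO_PRIORITY_HINT = {"very_low": 0, "low": 1, "normal": 2, "high": 3}
PROCESS_INFO_IO_PRIORITY = 33  # ProcessIoPriority information class

def set_process_io_priority(pid: int, level: str) -> bool:
    """Set a process's I/O priority hint via NtSetInformationProcess.

    Returns True on STATUS_SUCCESS; a no-op (False) off Windows.
    """
    if sys.platform != "win32":
        return False
    PROCESS_SET_INFORMATION = 0x0200
    hint = ctypes.c_ulong(IO_PRIORITY_HINT[level])
    k32 = ctypes.windll.kernel32
    ntdll = ctypes.windll.ntdll
    handle = k32.OpenProcess(PROCESS_SET_INFORMATION, False, pid)
    if not handle:
        return False
    try:
        status = ntdll.NtSetInformationProcess(
            handle, PROCESS_INFO_IO_PRIORITY,
            ctypes.byref(hint), ctypes.sizeof(hint))
        return status == 0  # STATUS_SUCCESS
    finally:
        k32.CloseHandle(handle)
```

Memory priority, by contrast, has a documented per-thread path (`SetThreadInformation` with `ThreadMemoryPriority`), which is part of why the two required such different amounts of digging.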
Once full control of the CPU, RAM, and storage had been mastered, it was time for the peripheral devices, so the next development step was power management. As far as we know, WPS is the only application that manages power dynamically rather than statically. This means that for maximum performance you do not need to keep the CPU frequency pinned at 100% and forbid the cores from parking, which forces constant operation at maximum load and increases power consumption. By analyzing applications with a neural network, it became possible to get the same maximum performance, but only while the user is working with heavy software or playing demanding games, so that the rest of the time the computer is not converting electricity into heat for no benefit.
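A dynamic policy of that shape can be sketched as a small hysteresis controller over a maximum-frequency cap. The thresholds, step size, and 50% floor below are illustrative assumptions of ours, not the neural-network policy WPS actually uses:

```python
def next_power_cap(current_cap: int, cpu_load: float, heavy_app_running: bool) -> int:
    """Return the next maximum-CPU-frequency cap, in percent.

    Illustrative hysteresis policy: jump straight to 100% while a heavy
    application is running (or load spikes), ramp down gradually when
    the machine is near idle, and hold steady under moderate load.
    """
    if heavy_app_running or cpu_load > 0.85:
        return 100                        # full performance on demand
    if cpu_load < 0.30:
        return max(50, current_cap - 10)  # near idle: step down to a 50% floor
    return current_cap                    # moderate load: hold steady
```

On Windows, such a cap can be written into the active power scheme, e.g. `powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX <cap>` followed by `powercfg /setactive SCHEME_CURRENT` (the alias names come from the powercfg documentation).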
And in addition to optimizing the key elements of the PC, we added data-transfer optimization (traffic compression and ad blocking over a VPN), as well as automatic cleanup of junk files.
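At its core, junk-file cleanup is "enumerate, filter by rule, delete". A minimal sketch of the scanning half, with deliberately simple illustrative rules (extension plus age) — a production cleaner would rely on curated per-application locations rather than extensions alone:

```python
import time
from pathlib import Path

def find_junk(root: str, extensions=(".tmp", ".log"), older_than_days: float = 7) -> list:
    """Return files under `root` that look like junk: a matching
    extension and no modification for `older_than_days` days.
    The rules here are illustrative placeholders."""
    cutoff = time.time() - older_than_days * 86400
    junk = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            if path.stat().st_mtime < cutoff:
                junk.append(path)
    return junk
```

Keeping the scan separate from the deletion makes it easy to show the user what would be removed before anything is touched.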
Time has repeatedly confirmed that we chose the right application architecture. Because all operations on the system are performed through WinAPI, the question of stability (of both the system and the software) when changing operating parameters turned out to be solved almost perfectly. In more than two years of the application's life, across the more than a million devices on which it has been installed, we have not registered a single case of system instability or incompatibility with third-party applications.
UI / UX
One of the key areas of work over this period has been improving the UI/UX, which lowered the barrier for new users to master the application's key functions.
Hardware Performance Rating
It became possible not only to report how many abstract benchmark points a given PC scores, but to evaluate how the performance of an individual computer compares with all the real PCs on which the application is installed. This function solves three problems at once. First, it shows which component of the computer is the most outdated and due for replacement. Second, it shows how much faster or slower this computer is than all the others. Third, the built-in algorithms can apply different approaches to raising the performance of powerful and weak PCs to achieve the best results.
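The core of such a rating is a percentile over the installed base, computed per component. A minimal sketch under assumed names and a trivially simple "lowest percentile wins" rule — the real scoring and aggregation are surely more involved:

```python
from bisect import bisect_right

def percentile(score: float, population: list) -> float:
    """Share (0..100) of known machines whose score this one
    matches or beats, within the installed base."""
    ranked = sorted(population)
    if not ranked:
        return 100.0
    return 100.0 * bisect_right(ranked, score) / len(ranked)

def most_outdated_component(this_pc: dict, population: dict) -> str:
    """Name of the component that ranks lowest against the same
    component across the installed base: the upgrade candidate."""
    return min(this_pc, key=lambda c: percentile(this_pc[c], population[c]))
```

Because the comparison is relative to the live installed base rather than a fixed benchmark table, the rating stays meaningful as the hardware population shifts over time.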
Sales are just as important as production; one cannot exist without the other, so both areas of software development require the same attention and investment.
Feedback from the end customer decides everything: no matter how outstanding your technological solutions are, they are useless if the client does not need them, does not know how to use them, or simply finds them unattractive.
Any decision needs to be checked and double-checked: theory and practice diverge regardless of your level of expertise, so every change should be validated with a focus group to make sure your decisions are correct.
Stability is the key to success. This is true at every level, from the strategic behavior of the company and its people down to the product itself.