Volume 3
Browsing Volume 3 by Subject "article"
Now showing 1 - 6 of 6
Item Automated Approach for the Importing the New Photo Set to Private Photo Album to Make it More Searchable (2020) Nikulin, Dmytro; Buchko, Olena
This paper describes several Information Retrieval (IR) multimedia systems that work primarily with private photo albums and compares them in terms of functionality. The systems chosen for comparison (Mylio, Google Photos, Apple Photos, Synology Moments and digiKam) are among the best known and most widely used. Based on this comparison, the paper defines the end user's needs, then identifies and prioritizes (based on the authors' experience and opinion) the issues in private IR multimedia systems that any technical solution should address. The following issues were identified as the most important from the end user's perspective: first, missing EXIF file metadata (divided into two sub-issues: a missing taken date and missing GEO data for the place where the photo was taken); second, linking a photo to an author entity (based on the camera brand and model given in the metadata); and last, linking a set of photos to a predefined type of event, such as birthdays, anniversaries, holidays or trips (based on tagging the photos in a semi-automated way). For each issue, the paper provides technical approaches and program algorithms (in pseudocode), and describes the use of relevant tools where applicable. Each tool mentioned is described in a few words and given with a reference for further reading. The paper also describes human involvement in the process where a fully automated process is not possible. In conclusion, the paper summarizes the key points identified while analyzing and addressing the issues.
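As an illustration of the first issue (a missing EXIF taken date), the fallback logic the paper describes in pseudocode might be sketched as follows; the tag name, filename pattern and function are hypothetical, not the authors' implementation:

```python
import re
from datetime import datetime

# Hypothetical filename convention such as IMG_20200115_123000.jpg;
# real libraries (e.g. Pillow, exifread) expose "DateTimeOriginal"
# under library-specific keys.
FILENAME_DATE = re.compile(r"(20\d{2})(\d{2})(\d{2})")

def infer_taken_date(exif, filename):
    """Return the photo's taken date: EXIF first, filename pattern as fallback."""
    raw = exif.get("DateTimeOriginal")
    if raw:
        # EXIF stores timestamps as "YYYY:MM:DD HH:MM:SS"
        return datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")
    m = FILENAME_DATE.search(filename)
    if m:
        year, month, day = map(int, m.groups())
        return datetime(year, month, day)
    return None  # human involvement needed, as the paper suggests
```

When both sources are absent the function deliberately returns None rather than guessing, which matches the paper's point that a fully automated process is not always possible.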
The conclusion also proposes a technical solution (to be used when importing a new photo set into an existing IR multimedia system), with a context diagram representing the user, the user's devices with multimedia data, the system that imports data from those devices, and the IR multimedia system that stores the whole private multimedia collection and is searched for the target photo set. Finally, it defines workflow steps describing the main activity flow (involving both humans and software) to be implemented in a future technical solution.

Item Distributed Systems Technical Audit (2020) Gorokhovskyi, Kyrylo; Zhylenko, Oleksii; Franchuk, Oleh
Modern enterprise systems consist of many deployment artifacts, which is why microservices architecture is extremely popular now. The use of distributed systems is growing rapidly despite their increased complexity and the difficulty of building them. The main reason for this trend is that the advantages of distributed systems outweigh their disadvantages. Nevertheless, releasing a product to market is not the final step of the software development lifecycle. The next important step is maintenance, which lasts much longer than development. System failures and delays in finding and fixing a problem can cause huge financial and reputational costs. In addition, new features driven by market changes should be delivered on time. Before releasing a product to market, we would like to know possible technical gaps in advance, to understand what to expect and perhaps fix some issues in order to save time and money later. In other words, we should be able to decide, based on data, whether the product is ready for launch. Such analysis is also necessary when we take ownership of an existing product. A technical audit helps to uncover technical debt and assess the risks related to maintaining and extending the system.
It should be considered a mandatory activity during release preparation and ownership transfer. Well-defined criteria help to conduct an audit smoothly and uncover most of the technical debt. A properly conducted technical audit reduces the risk of problems after release, but it guarantees neither the commercial success of the product nor the complete absence of problems. In this article we define what distributed systems are, review Monolithic, Microservices and Serverless architectures, and describe what quality attributes are and what should be taken into account during technical audits. Next, we dive into the technical audit process and specify which aspects of the system must be considered during an audit. Then we iterate over checklist items to provide guidelines, based on industry best practices, that help prepare for a software system audit.

Item Image Shadow Removal Based on Generative Adversarial Networks (2020) Andronik, Vladyslav; Buchko, Olena
Accurate shadow detection and removal in images are complicated tasks, as it is difficult to tell whether a dark or gray region is caused by a shadow. This paper proposes an image shadow removal method based on generative adversarial networks. Our approach is trained in an unsupervised fashion, which means it does not depend on time-consuming data collection and labeling. Together with training in a single end-to-end framework, this significantly raises its practical relevance. Taking an existing method for unsupervised image transfer between domains, we have researched its applicability to the shadow removal problem. Two networks are used: the first adds shadows to images, and the second removes them. The ISTD dataset was used for evaluation because it provides ground-truth shadow-free images as well as shadow masks.
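The shadow-detection metric used in the evaluation, Intersection over Union (the Jaccard index), can be sketched on binary masks as follows; this is an illustrative stdlib-only version, not the authors' code:

```python
def iou(pred_mask, true_mask):
    """Intersection over Union (Jaccard index) of two binary masks,
    given as flat lists of 0/1 values."""
    assert len(pred_mask) == len(true_mask)
    # area of overlap: pixels marked shadow in both masks
    intersection = sum(1 for p, t in zip(pred_mask, true_mask) if p and t)
    # area of union: pixels marked shadow in either mask
    union = sum(1 for p, t in zip(pred_mask, true_mask) if p or t)
    return intersection / union if union else 1.0  # two empty masks agree perfectly
```

In practice the masks would be 2-D image arrays; flattening them first gives the same ratio.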
For shadow removal we used the root mean squared error between generated and real shadow-free images in the LAB color space. Evaluation is divided into region and global variants, where the former is applied to shadow regions and the latter to whole images. Shadow detection is evaluated with Intersection over Union, also known as the Jaccard index, computed between the generated and ground-truth binary shadow masks by dividing the area of their overlap by the area of their union. We selected 100 random images for validation. During the experiments, multiple hypotheses were tested; most of them concerned how to use an attention module and where to place it. Our network produces better results than the existing approach in the field. Analysis showed that attention maps obtained from an auxiliary classifier encourage the networks to concentrate on the regions that are most distinctive between domains. However, generative adversarial networks demand a more accurate and consistent architecture to solve the problem more efficiently.

Item Investigation of the Relationship Between Software Metrics Measurements and its Maintainability Degree (2020) Hlybovets, Andrii; Shapoval, Oleksandr
The goal of this work is the practical application of empirical software engineering methods and of algorithms for data collection and analysis. The results include software measurement, the analysis and selection of direct and indirect metrics for research, and the identification of dependencies between direct and indirect metrics. Based on the results, dependencies between software metrics were built, and software expertise properties were selected by individual variation. Primary statistical analysis, expert estimation, and correlation and regression analysis were used to analyze the measurement results. Expert estimation is the dominant strategy for estimating software development effort.
Typically, effort estimates are over-optimistic, and there is strong over-confidence in their accuracy. Primary data analysis is the process of comprehending the collected data in order to answer research questions or to support or reject the research hypotheses that the study was originally designed to evaluate. Correlation analysis makes it possible to conclude which metrics and expert estimations are strongly coupled and which are not. Regression analysis involves both graphical construction and analytical research and makes it possible to conclude which metrics and expert estimations are the most strongly coupled. Analyzing regression lines for metrics with normal and non-normal distributions makes it possible to identify 'metric - expert estimation' pairs. Metric relations were calculated and measured to determine the relation of such quality attributes as Understandability and Functionality Completeness. Understandability expresses the clarity of the system design: if the system is well designed, new developers can easily understand the implementation details and quickly begin contributing to the project. Functionality Completeness refers to the absence of omission errors in the program and database; it is evaluated against a specification of software requirements that defines the desired degree of generalization and abstraction. The relationship between metrics and expertise includes building direct relationships between direct metrics and expertise and between indirect metrics and expertise. Additionally, it was determined whether the relationships between direct metrics and expert estimates and between indirect metrics and expert estimates share common trends.
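As a side illustration of the correlation step, a Pearson correlation coefficient between one metric and a set of expert estimates can be computed as follows; the data values are hypothetical and not the study's measurements:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between a software metric and
    expert estimates (illustrative stdlib version)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # covariance of the two samples (unnormalized)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    # standard deviations (unnormalized, the factors cancel)
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical data: a size metric per module vs. expert understandability scores
metric = [120, 450, 300, 900, 150]
estimates = [4.5, 3.0, 3.5, 1.5, 4.0]
```

A value near -1 or +1 would mark the pair as strongly coupled; values near 0 would mark it as weakly coupled.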
The practical results of this work can be applied in software measurement to analyze which changes in the code (affecting a given metric) will increase or decrease which quality attribute.

Item Serverless Event-driven Applications Development Tools and Techniques (2020) Morenets, Ihor; Shabinskiy, Anton
Serverless, a new cloud-based architecture, brings development and deployment flexibility to a new level by significantly decreasing the size of the deployment units. Nevertheless, it has not yet been clearly defined which applications it should be employed for and how to use it most effectively, and this is the focus of this research. The study uses Microsoft Azure Functions, one of the popular mature tools, because of its stateful orchestrators, Durable Functions. The tool is used to present and describe four flexible serverless patterns with code examples. The first pattern is HTTP nanoservices; the example demonstrates how flexible the Function-as-a-Service model, which uses relatively small functions as deployment units, can be. The second usage scenario is a small logic layer between a few other cloud services. Thanks to its event-driven nature, serverless is well suited to tasks such as performing an action in one service after a specific event in another. New functions integrate easily with the API from the first example. The third scenario, distributed computing, relies on the ability of Durable Functions to launch a myriad of functions in parallel and then aggregate their results; a custom MapReduce implementation is presented in this section. The last pattern described in this research significantly simplifies concurrent work with mutable data by implementing the actor model: Durable Entities guarantee that messages are delivered reliably and in order, as well as the absence of deadlocks. The results of this work can be used as a practical guide to the main serverless concepts and usage scenarios.
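The fan-out/fan-in behaviour behind the distributed-computing pattern can be illustrated in plain Python; this sketch uses a thread pool as a stand-in for parallel activity functions and is not Azure code:

```python
from concurrent.futures import ThreadPoolExecutor

def word_count(chunk):
    """Map step: count words in one chunk (one activity invocation)."""
    return len(chunk.split())

def fan_out_fan_in(chunks):
    """Fan out the map step over all chunks in parallel, then
    aggregate (reduce) the partial results, mirroring an orchestrator
    that awaits all activity tasks before combining them."""
    with ThreadPoolExecutor() as pool:
        partial = list(pool.map(word_count, chunks))  # fan-out
    return sum(partial)                               # fan-in / reduce
```

In Durable Functions the orchestrator plays the role of fan_out_fan_in, scheduling activity functions and awaiting all of them before aggregating.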
The main topic of future research was chosen to be the development of a full-fledged serverless application using the typical patterns, in order to study the architecture in more depth.

Item Statical and Dynamical Software Analysis (2020) Sosnytskyi, Serhii; Glybovets, Mykola; Pechkurova, Olena
Building quality into software has become an important trend and a natural choice in many organisations. Current methods of measuring and assessing software quality, security and trustworthiness cannot completely and effectively guarantee safe and reliable operation of software systems. This article overviews static and dynamic software analysis methods, their main concepts, and the families of techniques. It explains why a combination of several analysis techniques is necessary for software quality and gives examples of how static and dynamic analysis may be introduced into a modern agile software development life cycle. As the summary of software analysis techniques presented in Table 1 shows, due to the computability barrier no technique can provide fully automatic, robust, and complete analyses. Testing sacrifices robustness. Assisted proving is not automatic (even if it is often partly automated, the main proof arguments generally need to be provided by humans). Model-checking approaches can achieve robustness and completeness only with respect to finite models, and they generally give up completeness when considering programs (the incompleteness is often introduced at the modeling stage). Static analysis gives up completeness (though it may be designed to be precise for large classes of programs of interest). Finally, bug finding is neither robust nor complete. Another important dimension is scalability: in practice, all approaches have limitations regarding scalability, although these limitations vary depending on the intended applications (e.g., input programs, target properties, and algorithms used).
Already implemented code can be analysed in a continuous integration environment by a tool such as SonarQube. Properly configured metrics and quality gates provide fast and detailed feedback on incremental changes, from the development machine to high-load enterprise production environments. Software analysis helps to improve quality and development speed in an agile development life cycle at a reasonable cost.
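A sketch of how such a quality gate might be wired into a scanner configuration; the property keys are standard SonarQube scanner parameters, while the values are placeholders and not taken from the article:

```properties
# sonar-project.properties (hypothetical project)
sonar.projectKey=my-service
sonar.sources=src
sonar.tests=test
# make the CI step fail when the quality gate is not met
sonar.qualitygate.wait=true
```

With qualitygate.wait enabled, the scanner polls the server for the gate result, so an incremental change that breaks a metric threshold fails the build immediately.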