Volume 6 Issue 11
Author's Name: Prithiraj, Roxanna Samuel, Ponmalai, Jinu Joseph
Abstract—An electrocardiogram (ECG) records the electrical activity of the heart over a period of time. Detailed information about the condition of the heart can be obtained by analyzing the ECG signal. The wavelet transform and the fast Fourier transform are among the methods used to diagnose cardiac disease. This paper presents a survey of ECG signal analysis and related studies on arrhythmic and non-arrhythmic data. We discuss an efficient feature extraction process for the electrocardiogram, in which the six best P-QRS-T fragments are selected based on position and priority. The survey examines the outcomes reported by systems that apply various machine learning classification algorithms to the extraction and analysis of ECG signal features. Support Vector Machine (SVM), K-Nearest Neighbor (KNN), and Artificial Neural Network (ANN) classifiers are the most important algorithms used for this purpose. Several publicly available datasets are used for arrhythmia analysis; among them, the MIT-BIH ECG-ID database is the most widely used. The drawbacks and limitations of existing approaches are also discussed, from which future challenges and concluding remarks are drawn.
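As a concrete illustration of the classification stage this survey covers, the following is a minimal sketch of SVM-based beat classification in Python with scikit-learn. The feature matrix and labels are random placeholders standing in for P-QRS-T features extracted from MIT-BIH records, not data from the surveyed papers.

```python
# Minimal sketch: SVM classification of ECG beats from pre-extracted
# P-QRS-T features. The random feature matrix below is a hypothetical
# stand-in for features extracted from MIT-BIH recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))     # six features per beat (e.g. P-QRS-T fragments)
y = rng.integers(0, 2, size=500)  # 0 = normal, 1 = arrhythmic (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)

clf = SVC(kernel="rbf", C=1.0)    # RBF-kernel SVM, a common choice in this literature
clf.fit(scaler.transform(X_train), y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(scaler.transform(X_test))))
```

The same pipeline would accept KNN or ANN classifiers as drop-in replacements for the SVC, which is how such surveys typically compare the three.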
Author's Name: Vinoth Kumar, Ghariharan, Aswanthkumar
Abstract—The goal of this work is to examine software reliability metrics. Reliability is an important aspect of any program; it cannot be ignored, yet it is difficult to measure. “Program reliability is defined as the probability of running programs without disruption in a specific environment for a specified period of time.” Software reliability differs from hardware reliability, and it is hard to achieve because program complexity is high. Different methods can be used to increase system performance, but it is difficult to balance development time, budget, and software quality. The best way to ensure reliability is to build high-quality programs throughout the life cycle of the program. We discuss software reliability metrics in this paper. Metrics applied early can help detect and correct requirements defects, preventing errors later in the program life cycle. We also assess the consistency and quality of an information-system database with the help of RStudio, illustrate reliability based on the value of cyclomatic complexity, and classify the data or software as more reliable, less reliable, or somewhat reliable.
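Since the abstract ties reliability to cyclomatic complexity, the sketch below shows one way the metric can be computed: counting branch points in a Python function's abstract syntax tree. The paper itself works in RStudio; this Python version only illustrates the metric, and the V(G) <= 10 threshold is a common rule of thumb, not the paper's own criterion.

```python
# Hedged sketch: approximate McCabe cyclomatic complexity for Python
# source by counting decision points in its AST, then adding one.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """V(G) = number of decision points + 1."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))
    return decisions + 1

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            return "even and large"
    return "other"
"""
v = cyclomatic_complexity(sample)
# Informal reading: lower V(G) suggests simpler, more reliable code.
print("V(G) =", v, "->", "more reliable" if v <= 10 else "less reliable")
```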
Author's Name: Fatima Khan, P.R. Ravinder
Abstract—Methods for detecting facial landmarks have advanced greatly in recent times. However, they still struggle in the presence of poor lighting conditions, extreme pose, or occlusions. A well-established family of strategies for facial feature extraction is the Constrained Local Model (CLM). Recently, however, CLMs have fallen out of favor compared with cascaded regression-based methodologies. This is largely because existing local CLM detectors fail to model the highly complex landmark appearance, which is affected by expression, illumination, facial hair, and make-up. This paper examines the performance of the Constrained Local Model (CLM) in locating facial features. The CLM relies on a patch model to detect the required features in a facial image. In this paper, patch models are built using Support Vector Regression (SVR) and the Constrained Local Neural Field (CLNF). We show that the CLNF model exceeds SVR by a large margin on the LFPW database for identifying facial landmarks.
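To make the patch-model idea concrete, here is a hedged sketch of a CLM-style patch expert built with scikit-learn's SVR: the regressor scores small pixel patches, and sliding it over a search window produces a response map whose peak estimates the landmark position. All data here is synthetic; real patch experts such as CLNF are trained on labelled face images, and this is only a loose stand-in for the paper's models.

```python
# Hedged sketch of a CLM-style patch expert. An SVR maps a small pixel
# patch to an "alignment" score; sliding it over a search window yields
# a response map whose peak is the landmark estimate.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
P = 11                                   # patch side length (assumption)
patches = rng.normal(size=(300, P * P))  # synthetic training patches
scores = rng.uniform(0, 1, size=300)     # 1 = centred on landmark, 0 = off-landmark

expert = SVR(kernel="linear").fit(patches, scores)

window = rng.normal(size=(31, 31))       # synthetic search window around a landmark guess
response = np.zeros((31 - P + 1, 31 - P + 1))
for i in range(response.shape[0]):
    for j in range(response.shape[1]):
        patch = window[i:i + P, j:j + P].reshape(1, -1)
        response[i, j] = expert.predict(patch)[0]

peak = np.unravel_index(response.argmax(), response.shape)
print("landmark response peak at offset:", peak)
```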
Author's Name: Sumantha, Sandeep Maity, Kumar Singh Mal, Subhandar
Abstract—Cloud traffic is a major issue faced by users every day. Users are unable to access files at the desired time because of delays caused by traffic, which is induced when many users access the same network at a given point in time. This paper aims to reduce cloud traffic by directing each user to the datacenter with the most capacity left for quick file access, instead of entering datacenters in sequential order as in existing systems. The foremost challenge in handling big data centers in clouds is balancing the load during flow scheduling, since huge amounts of data are transferred at regular intervals among thousands of customers and clients. With the rapid growth in applications, utilizing data centers effectively has become a challenging task for cloud services, particularly during peak-time usage, when user requests are unbalanced and large numbers of demands must be handled. In this project, a Software Defined Network (SDN) controller is applied to improve the bandwidth utilization of the Dynamic Circuit Network (DCN) and to reduce the delay experienced by end users. The project presents a genetic load-balancing algorithm that provides the provider with high bandwidth utilization and lowers end-user delay.
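As an illustration of the genetic load-balancing idea, the sketch below evolves assignments of requests to datacenters so that load stays balanced against remaining capacity. The capacities, request sizes, and fitness function are hypothetical, and the paper's SDN/DCN machinery is not modelled here.

```python
# Hedged sketch of a genetic load balancer: each chromosome assigns
# incoming requests to datacenters; fitness rewards balanced, feasible
# utilisation so the datacenter with the most capacity left absorbs load.
import random

random.seed(0)
CAPACITY = [100, 80, 60]                          # per-datacenter capacity (assumption)
REQUESTS = [random.randint(1, 10) for _ in range(40)]

def fitness(chrom):
    load = [0] * len(CAPACITY)
    for req, dc in zip(REQUESTS, chrom):
        load[dc] += req
    util = [l / c for l, c in zip(load, CAPACITY)]
    overload = sum(max(0.0, u - 1.0) for u in util)
    # Balanced utilisation is good; exceeding capacity is penalised heavily.
    return -(max(util) - min(util)) - 10.0 * overload

def evolve(pop_size=60, generations=200, mut_rate=0.05):
    k = len(CAPACITY)
    pop = [[random.randrange(k) for _ in REQUESTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(REQUESTS))  # one-point crossover
            child = [random.randrange(k) if random.random() < mut_rate else g
                     for g in a[:cut] + b[cut:]]      # per-gene mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best fitness:", round(fitness(best), 3))
```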
Author's Name: Manoj Gandhi, Agnihotri
Abstract—Identifying the physical aspect of the earth's surface (land cover) and how we exploit the land (land use) is a challenging problem in environment monitoring and many other subdomains. One of the most efficient ways to do this is through remote sensing (analyzing satellite images). Many algorithms and methods exist for such classification from satellite images, but they have several problems associated with them, such as improper feature extraction and poor efficiency. Problems associated with established land-use classification methods can be addressed by combining various optimization techniques with convolutional neural networks (CNNs). The structure of the CNN model is modified to improve classification performance, and the overfitting that may occur during training is avoided by optimizing the training algorithm. This work mainly focuses on classifying land types such as forest land, bare land, residential buildings, rivers, highways, and cultivated land. The outcome of this work can be further processed for monitoring in various domains.
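For concreteness, here is a minimal sketch of the kind of CNN classifier described above, written with the Keras API. The 64x64 RGB input size and the six classes are assumptions drawn from the land types listed; the paper's actual architecture and optimization tweaks are not reproduced.

```python
# Hedged sketch of a small CNN for land-cover classification.
from tensorflow.keras import layers, models

NUM_CLASSES = 6   # forest, bare land, residential, river, highway, cultivated

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),        # 64x64 RGB satellite tiles (assumption)
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),                    # helps curb the overfitting noted above
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, validation_split=0.1, epochs=20)
```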
Author's Name: Malleswari, Anumolu, Likitha, Divya
Abstract—Flooding is a disaster with multiple impacts on society and industry. It has severe effects on the urban economy and has forced scholars to develop resiliency plans. Scholars have developed various flood-forecasting techniques, and multiple modeling techniques are used for flood control, but each has certain limitations. Optimization techniques, together with artificial intelligence algorithms, can be helpful for monitoring and early prediction of floods. Neural network models promise better predictive accuracy than conventional models, but they face great difficulty in the selection of appropriate model parameters. In this context, an effort has been made here to explore the importance of the cuckoo search algorithm in flood management, where it can be used for parameter tuning. The hybrid approach of combining the cuckoo search algorithm with neural networks has given far better accuracy than the standalone algorithms. Such a cuckoo search metaheuristic supports earlier flood warnings than other methods and helps align flood-control activities. The paper presents the use of variants of the cuckoo search algorithm for early flood prediction and unfolds major insights into flood scenarios along with the significance of flood control and monitoring.
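To illustrate how cuckoo search can tune model parameters, the sketch below minimizes a toy quadratic objective standing in for a flood-forecasting network's validation error. The Lévy-flight step size, abandonment rate, and bounds are standard defaults, not values from the paper.

```python
# Hedged sketch of cuckoo search tuning two hypothetical model parameters.
import math
import numpy as np

rng = np.random.default_rng(0)
N, DIM, PA, ITERS = 15, 2, 0.25, 200     # nests, dimensions, abandon rate, iterations
LOW, HIGH = -5.0, 5.0

def objective(x):                        # placeholder for NN validation error
    return float(np.sum((x - 1.5) ** 2))

def levy_step(shape, beta=1.5):          # Mantegna's algorithm for Levy flights
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, shape)
    v = rng.normal(0, 1, shape)
    return u / np.abs(v) ** (1 / beta)

nests = rng.uniform(LOW, HIGH, (N, DIM))
fit = np.array([objective(x) for x in nests])

for _ in range(ITERS):
    best = nests[fit.argmin()]
    # Generate candidate solutions by a Levy flight biased toward the best nest.
    cand = np.clip(nests + 0.01 * levy_step((N, DIM)) * (nests - best), LOW, HIGH)
    cand_fit = np.array([objective(x) for x in cand])
    better = cand_fit < fit
    nests[better], fit[better] = cand[better], cand_fit[better]
    # Abandon a fraction PA of nests and rebuild them at random positions.
    abandon = rng.random(N) < PA
    nests[abandon] = rng.uniform(LOW, HIGH, (int(abandon.sum()), DIM))
    fit[abandon] = [objective(x) for x in nests[abandon]]

print("best parameters:", nests[fit.argmin()], "error:", round(fit.min(), 6))
```

In the hybrid approach the abstract describes, each nest would encode neural-network parameters (for example, hidden-layer size and learning rate), with the validation error of the trained network as the objective.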