Volume 10 Issue 3                          July-September 2022

Implementation of Fuzzy Logic Controller Based DSTATCOM for Power Quality Enhancement in Distribution Systems [pp 01-09]

https://doi.org/10.55083/irjeas.2022.v10i03001

Country- INDIA

Jatinder Pal Singh, Arshdeep Singh

Abstract: This research work presents a Fuzzy Logic Controller (FLC) based D-STATCOM for power quality (PQ) enhancement in the power distribution network. In the power distribution system, PQ is a major issue that arises from non-linearity and dynamic changes in the connected loads. The proposed work utilizes the FLC to generate switching pulses for the IGBT switches in the D-STATCOM to enhance the quality of power in distribution systems. The proposed FLC topology also shows superior performance over conventional PI controllers in the mitigation of harmonics. The proposed system is simulated in MATLAB/Simulink software to ensure effective realization.
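As a rough illustration of the kind of controller the abstract describes, the sketch below implements a generic Mamdani-style fuzzy controller in Python. The rule base, triangular membership functions, and normalized universe are illustrative assumptions, not the paper's actual FLC design for the D-STATCOM.

```python
# Minimal Mamdani-style fuzzy controller sketch: maps an error (e) and
# its change (de) to a control signal using triangular membership
# functions, min (AND) rule firing, and centroid defuzzification.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Linguistic terms on a normalized universe: Negative, Zero, Positive.
TERMS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}

# Hypothetical rule base (e_term, de_term) -> output term.
RULES = {("N", "N"): "N", ("N", "Z"): "N", ("N", "P"): "Z",
         ("Z", "N"): "N", ("Z", "Z"): "Z", ("Z", "P"): "P",
         ("P", "N"): "Z", ("P", "Z"): "P", ("P", "P"): "P"}

def flc(e, de, steps=101):
    # Fire each rule with the min of its antecedent memberships,
    # aggregating rules that share an output term with max.
    strength = {}
    for (te, tde), tout in RULES.items():
        w = min(tri(e, *TERMS[te]), tri(de, *TERMS[tde]))
        strength[tout] = max(strength.get(tout, 0.0), w)
    # Centroid defuzzification over a sampled output universe [-1, 1].
    num = den = 0.0
    for i in range(steps):
        y = -1.0 + 2.0 * i / (steps - 1)
        mu = max(min(w, tri(y, *TERMS[t])) for t, w in strength.items())
        num += y * mu
        den += mu
    return num / den if den else 0.0
```

In a real D-STATCOM loop, the defuzzified output would feed the PWM stage that gates the IGBTs; here it is simply a number in roughly [-1, 1].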

A Taxonomy of Security and Research Challenges in Cloud Computing [pp 10-23]

https://doi.org/10.55083/irjeas.2022.v10i03002

Country- INDIA

Umang Garg, Neha Gupta, Mahesh Manchanda

Abstract: Cloud computing is delivered as a storage service by a third party and has gained wide acceptance among business organizations and Information Technology (IT) industries. Cloud computing provides various services to users over the internet, such as applications, computation, and storage. Despite these advantages, cloud technology faces different types of privacy- and security-related issues, which have become major barriers to the adoption of cloud technology by organizations. This survey paper addresses cloud architecture; security and privacy issues, challenges, threats, and attacks; and future research directions for overcoming security- and privacy-related problems in the cloud environment.

Optimal Task Scheduling for Cloud Computing Using Rao Algorithm [pp 24-33]

https://doi.org/10.55083/irjeas.2022.v10i03003

Country- INDIA

Amrik Singh, Puneet Jain, Ajay Kumar

Abstract: Internet-based provision of computing resources is known as cloud computing. Through cloud computing it is possible to utilise data controlled by a third party or another individual at a distant location. Most cloud providers define the services they provide using Service Level Agreements (SLAs), in which the provider promises a certain level of quality of service. A cloud-based system comprises two sorts of clouds: computing clouds and data clouds. In cloud technology, task scheduling is critical to ensuring service quality and meeting the SLA, and a well-organized task schedule is one of the most important aspects of cloud computing. In this article, we design an optimal task-scheduling method using the Rao algorithm. Compared with other algorithms, the Rao algorithm is simple, requires few parameters, and needs no parameter tuning. Further, a multi-objective function based on the Rao algorithm performs the optimal scheduling. The performance evaluation considers task counts of 50, 100, 200, 300, 400, and 500, and several performance metrics are determined. The outcomes show that the presented technique yields lower average waiting time (AWT), average turnaround time (ATT), and makespan than the existing method.
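The parameter-free update the abstract credits to the Rao algorithm can be sketched as below (Rao-1 variant): each candidate moves toward the current best solution and away from the current worst. The sphere objective, population size, and bounds are illustrative stand-ins, not the paper's multi-objective scheduling cost.

```python
import random

# Rao-1 metaheuristic sketch (minimization). The core update
# x_new = x + r * (best - worst) uses no tunable control parameters,
# which is the property the abstract highlights.

def sphere(x):
    """Toy objective standing in for a scheduling cost function."""
    return sum(v * v for v in x)

def rao1(fn, dim=5, pop=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        costs = [fn(x) for x in X]
        best = X[costs.index(min(costs))]
        worst = X[costs.index(max(costs))]
        for i, x in enumerate(X):
            # Move toward best, away from worst; clamp to the bounds.
            cand = [min(hi, max(lo, xj + rng.random() * (bj - wj)))
                    for xj, bj, wj in zip(x, best, worst)]
            if fn(cand) < costs[i]:        # greedy acceptance
                X[i] = cand
    return min(X, key=fn)

best = rao1(sphere)
```

For task scheduling, each position vector would instead encode a task-to-VM assignment and `fn` would combine objectives such as AWT, ATT, and makespan.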

Analysis and Detection of Fraud in Credit Card Using SILOF [pp 34-41]

https://doi.org/10.55083/irjeas.2022.v10i03006

Country- PERU

Franklin Ore-Areche, Kelyn Nataly Muñoz-Alejo, Cledi Puma-Condori

Abstract: Nowadays e-commerce and online transactions are growing rapidly, and most customers use credit cards for both online and offline transactions; credit cards are used globally to buy goods and make payments. This rising use increases the chances of credit card fraud, and the credit card system is now at risk: fraudulent transactions cause financial losses to banks and institutions. To detect and distinguish frauds, several machine learning models are utilized for better prediction. The major objective of this article is to identify fraudulent transactions and outliers in credit card transactions. The credit card dataset is unbalanced. Among the various techniques by which fraudulent transactions can be detected, we use the isolation forest method, the local outlier factor, and the support vector machine. We use different metrics to evaluate performance and accuracy. Finally, a comparative analysis of the isolation forest, support vector machine, and local outlier factor shows which gives the better result.
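A minimal sketch of two of the unsupervised detectors the abstract names, using scikit-learn on synthetic transaction-like data; the feature layout, contamination rate, and injected "fraud" points are illustrative assumptions, not the paper's dataset or code.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

# Synthetic "transactions": a dense cluster of normal (amount, rate)
# pairs plus three extreme rows standing in for fraudulent records.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[50.0, 1.0], scale=[10.0, 0.2], size=(500, 2))
fraud = np.array([[900.0, 9.0], [850.0, 8.5], [950.0, 9.5]])
X = np.vstack([normal, fraud])

# Isolation forest: anomalies are points that are easy to isolate
# with few random axis-aligned splits.
iso = IsolationForest(contamination=0.01, random_state=0)
iso_labels = iso.fit_predict(X)          # -1 = anomaly, 1 = normal

# Local outlier factor: anomalies have much lower local density
# than their k nearest neighbours.
lof = LocalOutlierFactor(n_neighbors=20, contamination=0.01)
lof_labels = lof.fit_predict(X)          # -1 = anomaly, 1 = normal

# The three injected frauds are the last three rows.
iso_caught = int((iso_labels[-3:] == -1).sum())
lof_caught = int((lof_labels[-3:] == -1).sum())
```

On a real, unbalanced credit card dataset the contamination parameter would be set from the (small) known fraud rate rather than chosen by hand.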

Artificial Intelligence Based Zero Trust Network [pp 42-48]

https://doi.org/10.55083/irjeas.2022.v10i03013

Country- USA

Priya Parameswarappa

Abstract: Model-based security metrics are an emerging topic of cyber-security research that focuses on assessing an information system’s risk exposure. In this article we propose an end-to-end solution, deploying a zero-trust network utilising Artificial Intelligence, to understand the security posture of a system before it is rolled out and as it matures. The major part discusses the key methods and techniques that were utilized in the development process and the simplified operating principles of each developed process. Some of the developed processes were tested practically to evaluate their problems. Modules for automatic processing and data analysis were also developed; these modules can be connected when needed. The most important data collection methods were benchmarked to detect problematic situations in operation under different realistic conditions. With the insight gained from the benchmark tests, the problematic parts of the data collection were discovered, and solutions were proposed that could be developed and tested in the next iterations of the development process. Working artificial-intelligence-based detection and data-enrichment methods were created. The results of the article enable multiple continuing research and development projects related to data collection and data analysis with statistical and artificial-intelligence-based methods.

Implementation of Clustering Algorithms in IoT Devices for Prediction of Demand Bandwidth [pp 49-57]

https://doi.org/10.55083/irjeas.2022.v10i03004

Country- AZERBAIJAN

Vugar Hacimahmud Abdullayev, Cavida Damirova, Ragimova Nazila

Abstract: The Internet of Things (IoT) describes networks of connected things, equipped with sensors and actuators, that send and receive data over the internet. It comprises devices ranging from smart home appliances and health-care devices to agriculture and industrial automation. As the number of devices grows from millions to billions, supporting quality-based service with the existing infrastructure will be a challenging factor. Existing dynamic bandwidth allocation techniques cannot handle massive numbers of IoT devices; efficient bandwidth management requires optimization methods that adopt machine-learning approaches to automatically observe usage patterns and group devices into clusters. The proposed Enhanced Dynamic Bandwidth Allocation (EDBA) technique overcomes this issue by providing an uninterrupted bandwidth supply and improving the quality of service for IoT devices. The EDBA algorithm is provided with a bandwidth dataset from smart-home IoT devices; based on the utilisation level, datasets are collected from various IoT devices, and the devices can be clustered using a variety of clustering algorithms.
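The clustering step the abstract describes can be sketched as below with scikit-learn on synthetic data; the per-device usage profiles, device types, and choice of k-means are illustrative assumptions, not the paper's EDBA pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic usage data: each row is one device's hourly bandwidth
# usage (Mbps) over a day. Two hypothetical device classes: low-rate
# sensors (~2 Mbps) and high-rate cameras (~20 Mbps).
rng = np.random.default_rng(42)
low = rng.normal(2.0, 0.5, size=(30, 24))
high = rng.normal(20.0, 2.0, size=(30, 24))
usage = np.vstack([low, high])

# Group devices with similar usage patterns; bandwidth could then be
# allocated per cluster instead of per device.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(usage)
labels = km.labels_

# Per-cluster mean usage as a simple allocation signal.
cluster_means = [usage[labels == c].mean() for c in range(2)]
```

In a deployment, the cluster assignments would be recomputed as usage patterns drift, which is where the "dynamic" part of dynamic bandwidth allocation comes in.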