ARPN Journal of Engineering and Applied Sciences              ISSN 1819-6608 (Online)
 

 
 
 
 
ARPN Journal of Engineering and Applied Sciences                                    April 2015  |   Vol. 10  No. 6
   
Title:

A new secured VoIP using hierarchical threshold secret sharing

Author (s):

E. S. Thirunavukkarasu and E. Karthikeyan

Abstract:

Voice over Internet Protocol (VoIP) is a category of hardware and software that enables people to use the Internet as the transmission medium for telephone calls, sending voice data in packets over IP rather than through the traditional circuit-switched transmissions of the PSTN. Transmitting real-time voice is not as straightforward as transmitting ordinary text data: it suffers from packet loss, delay, quality degradation and security threats. One prominent advantage of VoIP is that telephone calls over the Internet do not incur a surcharge beyond what the user already pays for Internet access, much as the user does not pay for sending individual messages over the Internet. VoIP must provide protected transmission of private voice data between two endpoints. In such settings, a secret sharing scheme can partition the participants into a variety of levels. Hierarchical secret sharing is one of the best-suited schemes for VoIP because it is simple to compute and to implement in practice. An analysis of the signaling process and a study of simulation results demonstrate the proposed security enhancements in VoIP.
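Hierarchical threshold schemes are typically built by layering Shamir's (t, n) threshold construction level by level. Purely as an illustrative sketch of that flat building block (the prime, secret and share counts below are invented, and the paper's specific hierarchy is not reproduced):

```python
# Minimal Shamir (t, n) secret sharing over a small prime field --
# the building block hierarchical schemes apply per level.
import random

P = 2**13 - 1  # small Mersenne prime (8191), demo field only

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234  # any 3 of the 5 shares suffice
```

In a hierarchical variant, higher levels would simply receive shares generated with a lower threshold t.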

   

Full Text

 

Title:

Performance evaluation of shunt active power filters for different control strategies

Author (s):

Tirunagari Sridevi and Kolli Ramesh Reddy

Abstract:

Performance and stability of shunt active power filters improve considerably with a well-chosen current control strategy. The proposed control technique makes the supply currents sinusoidal using only two sensors on the supply side, without the harmonic detector present in the conventional control scheme of the shunt APF. The performance of the shunt active power filter was compared for three control techniques, namely a PI controller, a PI-Vector PI controller and a Fuzzy-Vector PI controller, with both a Six Switch Three Phase Inverter (SSTPI) and a Four Switch Three Phase Inverter (FSTPI). It was observed that the SSTPI can be replaced by the FSTPI, since the latter gives comparable results with reduced hardware and therefore lower cost. The proposed technique mitigates harmonic currents and reactive power to achieve a unity power factor at the supply side. Variations in the supply-side currents, in terms of harmonic distortion, are presented for all the controllers at different loads and load changes.

   

Full Text

 

Title:

Behavior of castellated beam column due to cyclic loads

Author (s):

Mara Junus, Parung Herman, Tanijaya Jonie and Djamaluddin Rudy

Abstract:

The purpose of this study was to determine the behavior of castellated beams reinforced with concrete under cyclic loading, so that the beam can be used as a structural element for resisting seismic load. The test beams comprised a castellated beam fabricated from a normal beam [CB], a castellated beam with concrete filler between the flanges [CCB] and a normal beam [NB] as a reference. Results showed that the castellated beam [CB] increases flexural capacity and energy absorption by 100.5% and 74.3%, respectively. On the other hand, the castellated beam lowers partial and full ductility by 12.6% and 18.1%, respectively, decreases the resistance ratio by 29.5% and accelerates the degradation rate of the stiffness ratio by 31.4%. With concrete filler between the beam flanges to improve its capability, the castellated beam increases flexural capacity by 184.78%, energy absorption by 217.1%, partial and full ductility by 27.9% and 26%, respectively, and the resistance ratio by 52.5%, while slowing the degradation rate of the stiffness ratio by 55.1%.

   

Full Text

 

Title:

A parametric study on design of helmet to mitigate brain damage and to optimize the weight

Author (s):

T. V. Hanumantha Rao, S. Srikanth and M. N. V. Krishnaveni

Abstract:

With the development of national highways for rapid transport and the large increase in two-wheeler engine capacity, the rate of road accidents has risen. Road-safety provision in the Indian scenario is minimal, and many accidents are either fatal or leave the rider with a traumatic brain injury. The helmet plays a vital role in rider safety and hence has to be designed with due consideration for the stresses and deformations that cause brain damage and thus affect the safety of the rider. In this paper, the effect of dynamic impact loads on the helmet, and of such impacts on the skull and brain, was studied at a high velocity of 30 m/s, representing real operating conditions. In this parametric study, after modeling the helmet, the critical angle of impact was identified. The three-layered helmet was analyzed for different material combinations. In the optimization process, the DOE table was generated using the Central Composite Design method, with the thicknesses of the three layers and the mass of the helmet as input parameters. From the design points generated in this stage, response surface graphs were generated to compare the variation in the output parameter for a given input variation. The optimal values for the helmet were finalized using the Multi-Objective Genetic Algorithm method, taking as constraints the allowable stresses for the brain, the skull and the material used, with due consideration of the factor of safety.
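A Central Composite Design table of the kind used above can be generated mechanically: 2^k factorial corners, 2k axial (star) points and centre replicates. A small sketch in coded units, shown here for an illustrative three-factor case rather than the paper's exact four-input table:

```python
# Generate coded design points for a k-factor Central Composite Design.
from itertools import product

def central_composite(k, alpha=None, n_center=1):
    """2^k factorial corners + 2k axial points at +/-alpha + center runs."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25  # rotatable-design choice of alpha
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s          # perturb one factor at a time
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return corners + axial + center

pts = central_composite(3)
assert len(pts) == 8 + 6 + 1   # 15 runs for 3 factors
```

The coded levels would then be mapped to physical ranges (layer thicknesses, mass) before running the simulations.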

   

Full Text

 

Title:

A different approach to robust automatic control for airplanes

Author (s):

Luca Piancastelli and Leonardo Frizziero

Abstract:

Current automatic control systems use linear mathematical models to validate automatic flight control for airplanes. Gain scheduling, nonlinearity handling and improved feedback through simulation are also introduced. Very fast computers operate the actuators in order to keep the airplane on the right path, in the current trim and with the proper safety margin. Some engineers are testing fuzzy control logic to control airplanes and UAVs (Unmanned Aerial Vehicles). The results are brilliant, since very simple controllers are able to fulfill the specification with little "knowledge" about the airplane's performance. This means that fuzzy controllers are very robust, since they are able to operate with badly degraded aerodynamics or with reduced thrust. However, no one has been able to validate the airplane/fuzzy-controller combination with a mathematical proof, so it is not certain that it will work in every condition. The same, incidentally, holds for the airplane/human-pilot model, so a mathematical proof is still required for this latter solution too. On the other side, very accurate, time-based nonlinear mathematical models are available for flight simulation. These models are used in several fields ranging from development to training. In recent years, computers that can run these accurate models in fractions of a second have been marketed at very low prices. The idea introduced in this paper is to run an accurate mathematical model on such a fast autopilot computer in order to optimize the sequence of commands to be input to the FBW (fly-by-wire) system of the airplane, keeping the path in the safest way possible. For this purpose it is necessary to have enough computing power to calculate this best solution at a rate compatible with correct control of the airplane. In this paper we demonstrate that these computing resources are already available, and it is predictable that the computing speed of future years will allow running even more sophisticated simulators.
The question may be: why use a more complicated system when the current control system fulfills the same task satisfactorily, in a cheaper and more reliable way? There are several answers. First, it is a matter of robustness: what happens if the yaw damper fails, the actuator of the left aileron is unable to fulfill its task, or the tail is ripped off? In such cases standard systems are not able to take the airplane to the ground safely, even though it is indeed possible to control the airplane by a coordinated action of the remaining control surfaces. Optimization means that it is possible to reduce the stress on structures in order to improve aircraft life, to find the control sequence that minimizes mean fuel consumption, or to reach the required trim on the right path in the shortest possible time. In other words, the approach is more flexible. It is also possible to monitor aircraft performance in order to evaluate external or internal disturbances. Air turbulence and wind gusts may be counteracted in order to preserve structural integrity or passenger comfort. Internal disturbances, such as defective functioning of components or controls and occasional failure of sensors, may be diagnosed, in some cases corrected, and in others simply reported after landing. The reliability improvement is not the least of the benefits: as a rule of thumb, more electronics or more components means less reliability, with the exception of redundancy, which is the case in this paper.

   

Full Text

 

Title:

Determination of the phytogeographic affinity index of the Tatacoa Desert eco-region with other Colombian tropical dry woodland zones

Author (s):

Jennifer Katiusca Castro, Nestor Enrique Cerquera and Freddy Humberto Escobar

Abstract:

The main objective of this paper is the determination of the phytogeographic affinity index of the Tatacoa Desert Ecological Region (eco-region) with other zones of tropical dry woodland (TDW) in Colombia. This goal was achieved by conducting an extensive bibliographic review of both the vegetal species recorded in the studied zone and the species it shares with other territories countrywide, in order to build a consolidated inventory and to determine the similarity indexes by means of multivariate regression analysis techniques. The results allowed us to establish the species of this life zone most suitable for regreening work, promoting the conservation of the native species appropriate for the tropical dry woodland of the Tatacoa Desert. The study also reveals the phytogeographic affinity among other zones of Colombia, so that the plant cover in all the affected areas can be improved.

   

Full Text

 

Title:

Hybrid algorithm for the control of technical objects

Author (s):

Finaev Valery I., Kobersy Iskandar S., Kosenko Evgeny Y., Solovyev Viktor V. and Zargaryan Yuri A.

Abstract:

The paper is dedicated to the topical problem of modeling and developing hybrid control systems. The peculiarity of such systems is the combined application of methods from classical control theory and fuzzy inference systems. Block diagrams of the hybrid control system and its operating algorithm are described. Application of the model to speed regulation of a DC motor is considered; fuzzy control is performed with a PI-fuzzy controller. The basic stages of hybrid modelling are described.
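As a rough illustration of the classical half of such a hybrid scheme, a discrete PI speed loop around a first-order motor model might look as follows. The gains and plant constants are invented for the sketch, and the fuzzy correction stage is omitted:

```python
# Discrete PI speed controller driving a toy first-order DC-motor model.
def simulate_pi(kp=2.0, ki=5.0, setpoint=100.0, dt=0.001, steps=5000):
    speed, integral = 0.0, 0.0
    tau, gain = 0.05, 1.0                       # toy plant: gain/(tau*s + 1)
    for _ in range(steps):
        error = setpoint - speed
        integral += error * dt
        u = kp * error + ki * integral          # PI control law
        speed += dt * (gain * u - speed) / tau  # Euler step of the plant
    return speed

final = simulate_pi()
assert abs(final - 100.0) < 1.0  # the integral term removes steady-state error
```

In the hybrid system described in the paper, a fuzzy inference block would adjust or replace this control law depending on the operating regime.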

   

Full Text

 

Title:

High pressure combined solar desalination system and power cycle

Author (s):

Ramesh Kumar V.K., G. Edison, Rajkumar P.R. and Rajendran R.

Abstract:

There is a worldwide shortage of both drinking water and power owing to population growth and its demands. A solar thermal power plant combined with a Multi Stage Flashing (MSF) system helps to meet the need for both power and distilled water. This paper addresses this twin demand. The integration of the power and desalination systems is the main objective, yielding two products from solar energy. The modified Rankine cycle produces distilled water by condensing the saturated steam at the turbine exit, or produces power by adding a turbine at the exit of the flashing system, and vice versa. A theoretical analysis of the proposed system is made, and placing the turbine at the exit of the flashing system in a high-temperature MSF is suggested. The proposed model is analyzed for a solar thermal single-stage flashing system producing 1000 LPD of distilled water. It yields 9.33 kW of power at a flashing temperature of 160 °C, with turbine inlet pressure and temperature of 5.92 bar and 170 °C, respectively. The high pressure in the flashing chamber, which avoids a vacuum, is an advantage of the proposed system. The proposal suits low-grade heat recovery while producing both power and desalinated water.

   

Full Text

 

Title:

A secured biometric framework for multimedia content protection

Author (s):

M. Gobi and D. Kannan

Abstract:

Multimedia content protection has become essential for the IT industry nowadays. The proposed scheme is based on layered encryption/decryption involving biometric authentication. Using fingerprints as keys in the encryption/decryption procedures eliminates the feasibility of illegal key sharing, which hampers content protection schemes based solely on traditional keys. The computation times required for the necessary encryption and decryption processes are provided for the AES symmetric-key system and the HECC asymmetric-key system; these times show the applicability of the method. Using widely available encryption/decryption systems (e.g., AES and HECC) increases the applicability even further, and custom hardware chips will reduce these times in future applications.

   

Full Text

 

Title:

An amalgamated approach of cryptography and steganography using IWT and random pixel selection for secure transmission

Author (s):

V. Vaithiyanathan, B. Karthikeyan, Anishin Raj M.M., M. Rajasekhar Reddy, Priyanka S. and K. Abinaya

Abstract:

Steganography is the art of concealing a message such that even cyber experts do not suspect its existence. Cryptography is the technique of secret writing, especially in the form of code and cipher systems, but the presence of the message is known. This paper presents secured data transmission that blends steganographic and cryptographic techniques to improve the standard of data security, such that none other than the sender and the intended receiver is able to extract the proper data. The integer wavelet transform is used to transform the data into an unintelligible format. The transformed data are embedded into the cover image using least-significant-bit substitution. The range of pixel intensities that occurs most frequently in the image is selected, and the data are embedded only in pixels within this range. The process is reversible. This improves the security of transmission, since the same data produce a different pattern each time they are embedded in a different image. The algorithm is well suited to encrypting passwords and keys transmitted via a third party, and can also be used to store digital signatures in a database.
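The least-significant-bit substitution step described above can be sketched as follows. This toy version works on a flat list of pixel values and omits the IWT stage and the frequent-intensity pixel selection of the paper:

```python
# Hide a byte string in the least significant bits of a flat pixel array.
def embed(pixels, data):
    bits = [(byte >> i) & 1 for byte in data for i in range(8)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = pixels[:]
    for k, b in enumerate(bits):
        out[k] = (out[k] & ~1) | b   # overwrite only the lowest bit
    return out

def extract(pixels, n_bytes):
    bits = [p & 1 for p in pixels[: 8 * n_bytes]]
    return bytes(sum(bits[8 * i + j] << j for j in range(8))
                 for i in range(n_bytes))

cover = list(range(256)) * 4            # stand-in for grayscale pixels
stego = embed(cover, b"key")
assert extract(stego, 3) == b"key"
```

Because only the lowest bit of each selected pixel changes, the visual difference between cover and stego image is imperceptible.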

   

Full Text

 

Title:

Effect of dye residue on some properties of cement

Author (s):

Olekwu Benjamin Elah

Abstract:

This paper studies the effect of using dye residue (DR) as a partial replacement of cement on some properties of cement. The compound composition, setting times, drying shrinkage and soundness of the cement were the main focus of the paper. DR-cement pastes containing 10, 20, 30, 40 and 50 percent DR as partial replacement of cement, together with a plain cement paste (zero percent DR) as control, were prepared. The investigations show that replacing cement with DR increased the percentage composition of tricalcium aluminate (C3A) and dicalcium silicate (C2S) in the resulting mixture from 10.83% to 15.18% and from 16.35% to 80.00%, respectively, at 50% DR, but decreased that of tetracalcium aluminoferrite (C4AF) and tricalcium silicate (C3S) from 9.12% to 5.26% at 50% DR and from 54.10% to 0.80% at 20% DR. It also increased the initial and final setting times of the DR-cement mixtures from 105 minutes to 165 minutes and from 183 minutes to 243 minutes, respectively, at a DR content of 20%. The use of DR as a partial replacement of cement was found to be effective in reducing the drying shrinkage of the resulting DR-cement mixtures and is capable of removing unsoundness in cement.

   

Full Text

 

Title:

Exudate detection and feature extraction using active contour model and SIFT in color fundus images

Author (s):

V. Ratna Bhargavi and V. Rajesh

Abstract:

Diabetic Retinopathy is the leading cause of vision loss in the world. Exudates are an early symptom of this disease, so early diagnosis and timely treatment are very important to prevent blindness. In this paper the Active Contour Model (ACM) is implemented to detect exudates and to obtain accurate borders of the lesions; the local features of the detected exudates are then extracted using the Scale Invariant Feature Transform (SIFT). The publicly available DiaretDB1 database of color fundus images is used to test the implemented method.

   

Full Text

 

Title:

Identifying microaneurysms in retinal images using Fuzzy C-Means Clustering

Author (s):

Ganesh Naga Sai Prasad V., Habibulla Khan and E. Gopinathan

Abstract:

The identification of microaneurysms (MAs) is an important phase in the screening and grading of diabetic retinopathy. MAs are detected in retinal images by analysing cross-section profiles centred on the local maximum pixels of the preprocessed image. Statistical measures of these profile features, taken across the orientations of the cross-sections, form the feature set used in a naïve Bayes classifier to remove spurious candidates. In this paper we present a clustering strategy to identify microaneurysms relative to the optic disc and cup in retinal fundus images. Fuzzy C-Means (FCM) clustering is used to cluster the data, with each data point assigned to clusters with different membership degrees. The first and major phase is preprocessing, in which the optic cup and disc of the input image are rotated. Initially the optic disc is rotated by some angle, the distances between the data points are calculated, and a cluster is formed around the centroid. The centroid and the data points of each cluster are identified at every phase, and the typical set of points is grouped together. This procedure continues until no further centroid is found. The cluster whose data points do not match the original image is regarded as the retinal image with microaneurysm disease. The experimental results show efficient and precise detection of microaneurysms in retinal images, with good robustness to pixel rotation.
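The Fuzzy C-Means alternation used above, a membership update followed by a centroid update, can be sketched on toy 1-D data; the paper applies it to retinal image pixels, and the data, cluster count and initialisation below are invented:

```python
# Bare-bones Fuzzy C-Means on 1-D data with fuzziness exponent m.
def fcm(points, c=2, m=2.0, iters=50):
    centers = points[:c]                       # naive initialisation
    for _ in range(iters):
        # membership u[i][j] of point i in cluster j
        u = []
        for x in points:
            d = [abs(x - v) + 1e-12 for v in centers]  # avoid divide-by-zero
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1))
                                for k in range(c)) for j in range(c)])
        # centroid update: membership-weighted means
        centers = [sum(u[i][j] ** m * x for i, x in enumerate(points)) /
                   sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(c)]
    return centers, u

centers, u = fcm([1.0, 1.2, 0.9, 8.0, 8.3, 7.9])
assert min(centers) < 2.0 and max(centers) > 6.0  # two groups recovered
```

Each pixel keeps a membership degree in every cluster, which is what distinguishes FCM from crisp clustering.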

   

Full Text

 

Title:

Compressive strength and slump flow of self-compacting concrete using fresh water and sea water

Author (s):

Erniati, Muhammad Wihardi Tjaronge, Rudy Djamaluddin, Victor Sampebulu, Nurmalasari and Darmawansyah Tri Sakti Darwis

Abstract:

Self-compacting concrete (SCC) has such high fluidity that it can flow and pass through reinforcement without segregation and fill the spaces in the mould with little or no compaction, resulting in a more impermeable concrete. This paper discusses the effect of sea water as mixing water on the workability of fresh SCC (slump flow and T50 tests) as well as on the growth of compressive strength, with fresh-water SCC as a comparison. The slump flow and T50 tests follow the EFNARC standard; compressive strength was tested per ASTM C 39/C 39M-99, at ages of 1, 3, 7, 28 and 90 days. The results showed that the slump flow of SCC using sea water is smaller than that of SCC using fresh water. SCC made with sea water has excellent workability, with no apparent segregation and evenly distributed aggregate. The compressive strength of SCC using sea water differs from that of SCC using fresh water at ages 1, 3, 7, 28 and 90 days by 11%, 9%, 3%, 0% and 0%, respectively. The compressive strength of sea-water SCC grows very quickly at early ages (1 and 3 days) compared with fresh-water SCC. From 1 to 28 days the strength gain of sea-water SCC is greater than that of fresh-water SCC, but the reverse holds at 90 days.

   

Full Text

 

Title:

Common rail diesel-electric propulsion for small boats and yachts

Author (s):

Luca Piancastelli and Leonardo Frizziero

Abstract:

The marine propulsion system is the heart of a ship. Its reliability directly affects safe navigation, the operating costs of the ship and its overall safety. Identifying the best propulsive solution is one of the key technologies in the marine field. Focusing on comprehensive reliability, this study first analyses the operating environments of the marine propulsion system and then evaluates the comprehensive reliability of the chosen system. Based on the fault tree of the marine propulsion system, a CRDID (Common Rail Direct Injection Diesel) electric hybrid marine engine system is taken as an example. The results show that the new CRDID-hybrid engine system can be reliably installed on small boats and yachts. It is believed that the knowledge gained in this study will provide a theoretical reference for research on the comprehensive reliability of hybrid marine propulsion systems.

   

Full Text

 

Title:

Evolutionary algorithm for intelligent hybrid system training

Author (s):

Finaev Valery I., Beloglazov Denis A., Shapovalov Igor O., Kosenko Evgeny Y. and Kobersy Iskandar S.

Abstract:

We consider an evolutionary algorithm for training an adaptive hybrid control system. The algorithm is distinguished by its combined operators of random change and by the possibility of dynamically correcting the operators' parameters based on information about the solution population. We define the types and parameters of the random-change operators: crossover, mutation and re-initialization, and consider their combination. We also present the parameter-adaptation algorithm for the combined random-change operators and develop the structure of a parallel genetic algorithm.

   

Full Text

 

Title:

Effective management of bus transportation through design of a fuzzy expert system

Author (s):

Abhinav S. V. and Krishna Anand S.

Abstract:

With the rapid rise in population, vehicular traffic has increased by leaps and bounds, and road accidents have been rising every day. The problem is especially acute in metros and large towns. Meanwhile, exponential improvements in technology have led to a fast-moving world: organizations aim to carry out work quickly and efficiently while cutting down the expenses incurred. With this perspective in mind, a fuzzy expert system has been designed for bus management. The system considers a large set of input parameters and frames decisions. Its chief focus is to provide a threshold beyond which both the number of accidents and the expenditure incurred can be cut down. The system's unique feature is its ability to model relationships between parameters that are difficult to compute mathematically; a specific instance, the relationship between vehicle speed and wear and tear, is clearly illustrated.
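One way a fuzzy rule base can relate speed to wear, sketched with invented triangular membership functions and a simple weighted-average defuzzification; none of the breakpoints below are taken from the paper:

```python
# Triangular membership functions plus weighted-average defuzzification.
def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def wear_level(speed_kmh):
    # degree to which the speed is "low", "medium", "high" (invented ranges)
    low = tri(speed_kmh, -1, 0, 50)
    mid = tri(speed_kmh, 30, 60, 90)
    high = tri(speed_kmh, 70, 120, 170)
    # rule outputs 0 / 5 / 10 combined by their firing strengths
    return (low * 0.0 + mid * 5.0 + high * 10.0) / (low + mid + high)

assert wear_level(40) < wear_level(100)  # faster driving -> more wear
```

A full expert system would chain many such rules over all the input parameters before defuzzifying a decision.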

   

Full Text

 

Title:

An expressive HMM-based text-to-speech synthesis system utilizing glottal inverse filtering for Tamil language
Author (s):

Sudhakar B. and Bensraj R.

Abstract:

This paper describes a Hidden Markov Model (HMM) based speech synthesis system that makes use of Glottal Inverse Filtering (GIF) to produce natural-sounding synthetic speech in the Tamil language. A GIF-based method is used for parameterization: Tamil speech is first parameterized into spectral and excitation features. The HMM system is trained on these speech parameters, which are then generated from the trained HMM according to the given Tamil text input. In the proposed work the voice sources are glottal flow pulses extracted from real speech, and the voice source is further customized according to the all-pole model parameters produced by the HMM. Experimental results show that the proposed system is capable of generating natural-sounding speech, and the quality is clearly better than that of a system using a conventional impulse-train excitation model.

   

Full Text

 

Title:

Three layered bar model architecture for stock market component analysis

Author (s):

S. Sudharsun, K. R. Sekar, K. S. Ravichandran and J. Sethuraman

Abstract:

The stock market is a place where companies mobilize money from the public to run their business and, in turn, benefit people through dividends and profit. It is an aggregation of both buyers and sellers. As a stock's market value increases, the market capitalization of the corresponding firm increases, benefiting the investors; conversely, a downturn in the business can cause investors to lose their investment, and if the company is not run successfully the stock price may go down. People invest in the stock market to earn profit in a short period of time, yet with the large number of shares available they find it difficult to choose the right company for their investment. This makes it the right time for large-scale analytics to guide investors on where to put their hard-earned money. Of the many methodologies available, two, K-Medoids (crisp) and Fuzzy K-Means (soft computing), are employed here for market analysis. We propose a three-layered "BAR model architecture" for stock market analysis, where the acronym BAR refers to Budget, Analysis and Result. Budgeting is the entry level that identifies the classes in the data set: applying distributional measures to a given data set yields what we call the Budget, while applying the above methodologies yields what we call the Actuals. Budget and Actuals were compared for variance using Chi-square and ANOVA tests. Since the variance is minimal, neither methodology is strictly needed for this kind of application, and we conclude that the Budget proves to be right. Purity levels of the attributes were measured through the Gini index. This approach leads to improved predictive accuracy and reliability; to our knowledge, data collection and analysis on this scale has not been reported in the past decade.
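Of the two clustering methods compared, the crisp one, K-Medoids, can be sketched compactly; the 1-D "return" figures below are invented and the real analysis would run on multi-attribute stock data:

```python
# Compact K-Medoids: assign points to nearest medoid, then pick the
# cluster member that minimises total in-cluster distance.
def k_medoids(points, k=2, iters=20):
    medoids = sorted(points)[:: max(1, len(points) // k)][:k]  # spread init
    for _ in range(iters):
        clusters = [[] for _ in medoids]
        for x in points:                                # assignment step
            j = min(range(k), key=lambda j: abs(x - medoids[j]))
            clusters[j].append(x)
        new = [min(c, key=lambda m: sum(abs(x - m) for x in c))
               for c in clusters if c]                  # update step
        if new == medoids:
            break
        medoids = new
    return sorted(medoids)

returns = [2.1, 1.9, 2.4, 11.8, 12.5, 12.0]
assert k_medoids(returns) == [2.1, 12.0]
```

Unlike K-Means, each medoid is an actual data point, which keeps the cluster representatives interpretable as real stocks.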

   

Full Text

 

Title:

Power quality improvement by voltage control using DSTATCOM in MATLAB

Author (s):

R. Thilepa, S. Devakumar and D. J. Yogaraj

Abstract:

This paper proposes a new topology for the Distribution Static Compensator (DSTATCOM), modelled in MATLAB. The proposed method of power quality improvement achieves unity power factor (UPF), which is not possible in previous methods. Maximum UPF is maintained while regulating the voltage at the load terminal during load fluctuations. The DSTATCOM solves power quality issues by achieving power factor correction, harmonic elimination, load balancing and voltage regulation according to the load requirement.

   

Full Text

 

Title:

Development of a model for copper analysis using hot vapor by formation of copper hydride compounds for the HV-AAS method

Author (s):

Muhammad Naswir

Abstract:

Copper hydride is an intermediate hydride which decomposes into copper atoms and hydrogen at 110 °C. In this research, the resulting copper atoms are measured in a heated open absorption cell. The copper hydride was formed by reduction of copper(II) with 4% hypophosphorous acid at 80 °C. Sodium lauryl sulfate 0.01 M was used as a micelle former to carry the copper hydride into the absorption cell of the hot vapour atomic absorption spectrometer (HV-AAS). The optimum experimental conditions were: 0.5 mL of 4% H3PO2, a reaction temperature of 80±5 °C, 0.5 mL of 0.01 M sodium lauryl sulfate, a formation time of 60±5 s and a glass pipe 15.35 cm long. Argon or air was blown in at the base of the reaction flask, with measurement at a wavelength of 324.7 nm; each measurement took about 35 s. The regression equation for standard copper absorbance from 5 to 30 mg/L was Y = 0.026 + 0.0091x. Using this equation, the characteristics of the analytical procedure were found to be: detection limit 3.65 mg/L, sensitivity 0.18 mg/L, accuracy 1.41%, precision 10.59%, repeatability 3.79 at a solution concentration of 10 mg/L, and an average 95% confidence interval of 9.86 ± 0.8 mg/L, i.e. a range of 9.1 to 10.7 mg/L. The linear range of standard concentration is 5 to 25 mg/L (absorbance 0.088 to 0.258). The analytical procedure characterized with HV-AAS is less effective than flame atomic absorption spectrometry (FAAS), whose detection limit is 0.004 mg/L and sensitivity 0.03 mg/L.
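The reported calibration line can be inverted to turn a measured absorbance into a concentration. A one-line sketch: the coefficients are those given in the abstract, while the sample absorbance below is invented for illustration:

```python
# Invert the calibration line Y = 0.026 + 0.0091*x (Y: absorbance,
# x: copper concentration in mg/L) to read concentration from absorbance.
INTERCEPT, SLOPE = 0.026, 0.0091

def concentration(absorbance):
    """x = (Y - intercept) / slope."""
    return (absorbance - INTERCEPT) / SLOPE

# an absorbance of 0.117 corresponds to 10 mg/L on this line
assert abs(concentration(0.117) - 10.0) < 1e-6
```

The inversion is only meaningful inside the stated linear range (5 to 25 mg/L, absorbance 0.088 to 0.258).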

   

Full Text

 

Title:

Calculation of the particle velocity in cold spray in the one-dimensional non-isentropic approach

Author (s):

A.N. Ryabinin

Abstract:

A mathematical model of the motion of gas and particles in a De Laval nozzle in the one-dimensional non-isentropic approximation is considered. The model takes into account the exchange of momentum and energy between the gas and solid phases. We obtain a system of ordinary differential equations for the gas parameters and the particle velocity and temperature. For the particular case of air as the carrier gas and copper particles, the system of equations is solved by the Runge-Kutta method. The inlet pressure was 2.5×10⁶ Pa and the inlet temperature 773 K. For particles of different diameters, the particle velocity and temperature were calculated at the nozzle exit in both the isentropic and non-isentropic approximations. The ratio of particle to gas mass flow rates was varied up to 20%. For small particles of 8 microns in diameter, the exit particle velocity decreases from 691 m/s to 641 m/s and the exit particle temperature increases from 113 K to 143 K as the ratio of mass flow rates rises from 0 to 20%. For large particles, the velocity difference is smaller than for small ones.
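The particle-velocity integration can be sketched with a classical fourth-order Runge-Kutta step. The drag law, gas state and step size below are simplified placeholders (uniform gas, constant drag coefficient) rather than the paper's coupled non-isentropic model:

```python
# RK4 integration of a single particle accelerated by gas drag.
import math

RHO_G, V_GAS, CD = 2.0, 700.0, 0.5       # assumed gas density, speed, drag coeff
D = 8e-6                                 # 8-micron particle, as in the abstract
RHO_P = 8960.0                           # copper density, kg/m^3
MASS = RHO_P * math.pi * D**3 / 6
AREA = math.pi * D**2 / 4

def accel(v):
    """Quadratic drag acceleration of the particle in the gas stream."""
    rel = V_GAS - v
    return 0.5 * RHO_G * CD * AREA * rel * abs(rel) / MASS

def rk4_step(v, dt):
    k1 = accel(v)
    k2 = accel(v + 0.5 * dt * k1)
    k3 = accel(v + 0.5 * dt * k2)
    k4 = accel(v + dt * k3)
    return v + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

v, dt = 0.0, 1e-6
for _ in range(2000):                    # 2 ms of flight
    v = rk4_step(v, dt)
assert 0.0 < v < V_GAS                   # the particle approaches the gas speed
```

The full model additionally integrates the gas equations and the particle temperature, with two-way momentum and energy exchange.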

   

Full Text

 

Title:

Design of an automation-script execution application for Selenium WebDriver and the TestNG framework

Author (s):

Rishab Jain C and Rajesh Kaluri

Abstract:

To develop and deliver software to the customer, validating its quality is highly important. Software testing can be performed manually or using automation tools to identify defects, assess the quality of the product and gain confidence in the software being developed. Automation tools help in the design and execution of test scripts, saving the time and cost involved in manual testing. This paper focuses on the automation testing tools currently available to support design and execution, the challenges manual testers face in executing automation scripts, approaches to executing automation scripts using TestNG and their disadvantages, and then gives an overview of the proposed web application, which overcomes the problems faced by manual testers, reduces the time spent on initial set-up before test-script execution and overcomes the disadvantages of execution using TestNG.

   

Full Text

 

Title:

Comparison of routing algorithms implemented for streaming applications in wireless sensor networks

Author (s):

M.R. Ebenezar Jebarani and T. Jayanthy

Abstract:

Wireless sensor network applications require tiny sensors with a short transmission or signaling range, which reduces the chances of detection. These size constraints impose limitations on CPU speed, memory, RF bandwidth and battery lifetime. Efficient communication techniques are therefore essential for increasing the lifetime and quality of data collection and decreasing the communication latency of such wireless devices. In this paper, the performance of a Quality-of-Service-based routing scheme (QUES) is compared with existing algorithms, and it is shown that QUES performs better than the existing routing algorithms SPIN, PEGASIS and WEED when transmitting different types of movie in the experiments. The test results show that QUES performs better with respect to bandwidth, delay, error rate, percentage of data loss, connection establishment time and the number of hops used in the communication path.

   

Full Text

 

Title:

Influence of molarity on physical properties of spray pyrolysed transparent conducting CdO thin films

Author (s):

V. Saravanan, J. Joseph Prince and M. Anusuya

Abstract:

Transparent, highly conducting CdO thin films were coated on glass substrates by a homemade chemical spray pyrolysis technique (CSPT). X-ray diffraction studies showed the CdO films to be cubic polycrystalline in structure with preferential reflection from the (111) plane. The crystallite size, determined by the Scherrer formula, is in the range 6-23 nm. SEM analysis of the film prepared at higher molarity, at two different magnifications, showed a porous surface with nanoclusters interconnected to form CdO nanorods. From UV-VIS-NIR studies, the absorption coefficient was found and related to the photon energy in order to evaluate the direct band gap energy (2.17 eV-1.99 eV). Depending on the molarity, Hall measurements showed the electrical resistivity and mobility at 300 K varying in the ranges 16.5×10^-3 Ωcm to 1.23×10^-3 Ωcm and 11.7 cm²/Vs to 34.2 cm²/Vs.
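The crystallite-size estimate mentioned above comes from the Scherrer formula D = Kλ/(β cos θ). A small sketch, with an illustrative peak — the 2θ position, FWHM and Cu K-alpha wavelength below are assumed values, not taken from the paper:

```python
import math

# Scherrer estimate of crystallite size D = K * lam / (beta * cos(theta)).
# All numeric inputs here are illustrative, not the paper's measurements.
def scherrer_size(two_theta_deg, fwhm_deg, lam_nm=0.15406, K=0.9):
    """Crystallite size in nm from one diffraction peak.

    two_theta_deg: peak position 2-theta in degrees
    fwhm_deg:      peak full width at half maximum in degrees
    lam_nm:        X-ray wavelength (Cu K-alpha assumed)
    K:             shape factor (0.9 assumed)
    """
    theta = math.radians(two_theta_deg / 2)
    beta = math.radians(fwhm_deg)          # broadening in radians
    return K * lam_nm / (beta * math.cos(theta))

# e.g. a CdO (111)-like peak near 2-theta = 33 deg with 0.6 deg broadening
print(round(scherrer_size(33.0, 0.6), 1))   # → 13.8
```

In practice the instrumental broadening is subtracted from the measured FWHM before applying the formula.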

   

Full Text

 

Title:

A multidimensional assessment framework for house buyers’ requirements of green homes

Author (s):

Radzi Ismail, Fazdliel Aswad Ibrahim, Mohd Wira Mohd Shafiei and Ilias Said

Abstract:

Developers are among the construction industry players that look into occupants' needs during housing development. The growing awareness of the need for sustainable, or green, construction practice has become a cornerstone for building a better environment that fulfils occupants' needs in the modern era without sacrificing environmental protection, preservation and conservation. This paper aims to develop a multidimensional assessment framework for house buyers' requirements of green homes in Malaysian housing development. The framework could address a multitude of problems in developing green homes in the Malaysian housing industry. The sample is defined as house buyers who attended property fairs in six states and one federal territory in Malaysia, namely Kedah, Penang, Kelantan, Melaka, Johor, Sabah and Kuala Lumpur. Respondents were selected by convenience sampling. Of 2600 questionnaires distributed, 1642 were answered and returned, a response rate of 63 per cent. The data were analyzed using descriptive statistics, factor analysis, correlation and multiple regression analysis. The study found that green homes consist of six main elements: community design and planning, efficient usage of resources, use of alternative resources, natural systems, protection and safety, and reusing and recycling. The results show that better-defined requirements for green homes among Malaysian house buyers point towards higher implementation of green home principles. The multiple regression analysis shows that all the variables could significantly predict house buyers' requirements of green homes in Malaysia.

   

Full Text

 

Title:

An Italian experience on crash modeling for roundabouts

Author (s):

Orazio Giuffrè, Anna Granà, Tullio Giuffrè, Roberta Marino and Tiziana Campisi

Abstract:

In the last few years a considerable number of safety models and evaluation tools have been developed specifically for roundabouts. Several safety performance functions (SPFs), indeed, have been implemented for roundabouts worldwide. Since SPFs are developed using crashes, traffic volume and other characteristics of a specific site (or geographical area), their direct transfer to contexts different from those in which they were calibrated is not always possible and, in any case, must be done very carefully: a safety performance function cannot be used without a transferability evaluation for sites outside the geographic area for which it was developed. Starting from these considerations, this paper calibrates a safety performance function for urban roundabouts in the Italian context, expanding a data sample already used in a previous work by Giuffrè et al. (2007); this SPF is then compared with other safety performance functions found in the literature, testing their transferability.
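A common SPF form is the power law N = a·Q^b, calibrated in log-log space. The sketch below shows that fitting step on made-up (traffic volume, crash frequency) pairs, not the paper's Italian roundabout data:

```python
import math

# Fitting a simple power-law safety performance function N = a * Q**b
# by ordinary least squares in log-log space. The (AADT, crashes/yr)
# pairs below are invented for illustration only.
sites = [(5000, 1.2), (8000, 1.9), (12000, 2.6), (20000, 4.1)]

xs = [math.log(q) for q, _ in sites]
ys = [math.log(n) for _, n in sites]
n = len(sites)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)

def predicted_crashes(aadt):
    """Predicted crash frequency for a site with the given traffic volume."""
    return a * aadt ** b

print(round(b, 2))   # fitted exponent for this toy data
```

Real SPF calibration typically uses negative binomial regression rather than plain least squares, so that overdispersion in crash counts is modeled; the log-log fit above only conveys the functional form.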

   

Full Text

 

Title:

Common rail diesel - automotive to aerial vehicle conversions: an update (part I)

Author (s):

Luca Piancastelli and Leonardo Frizziero

Abstract:

Back in 1997, when this activity began, it was generally thought that CRDIDs would completely replace the piston gasoline engines used in aircraft within a decade. This did not happen, for several reasons. This paper tries to identify those reasons. The most up-to-date solutions to the many problems that almost stopped this application are also introduced. In this first part, FADEC-related issues are introduced. Torsional vibration control is also briefly discussed.

   

Full Text

 

Title:

Efficiency of genetic algorithms in intelligent hybrid control systems

Author (s):

Beloglazov Denis A., Finaev Valery I., Zargarjan Jury A., Soloviev Victor V., Kosenko Evgeny Y. and Kobersy Iskandar S.

Abstract:

The paper is devoted to the application of genetic algorithms in intelligent hybrid control systems. We present a general view of the model of a hybrid adaptive control system and consider the interaction and tuning of its elements. Tuning is carried out using genetic algorithms; we designed ten genetic algorithms for this research. Experiments were carried out on the optimization of a multivariable function by genetic algorithms, using the example of learning a neural network emulator and a neuro-fuzzy controller. Finally, we draw conclusions about the efficiency of the genetic algorithms.
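As a minimal sketch of the kind of genetic-algorithm tuning loop described (selection, crossover, mutation), the toy below maximizes a one-dimensional fitness function; the population size, rates and fitness function are assumed for illustration and are not the paper's controller-tuning setup:

```python
import random

# Toy genetic algorithm maximizing f(x) = -(x - 3)**2 over [0, 10].
# Truncation selection keeps the top half, so the best individual never
# worsens; children are arithmetic crossovers with Gaussian mutation.
random.seed(0)

def fitness(x):
    return -(x - 3.0) ** 2

def evolve(pop, generations=60, mut_rate=0.3):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]           # truncation selection
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2                  # arithmetic crossover
            if random.random() < mut_rate:       # Gaussian mutation
                child += random.gauss(0, 0.5)
            children.append(min(10.0, max(0.0, child)))  # clamp to bounds
        pop = parents + children
    return max(pop, key=fitness)

best = evolve([random.uniform(0, 10) for _ in range(20)])
# best converges near x = 3, the maximizer of the fitness function
```

For controller tuning, the scalar `x` becomes a vector of controller or neural-network parameters and `fitness` becomes a closed-loop performance measure.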

   

Full Text

 

Title:

Productivity modeling of precast concrete installation using multiple regression analysis

Author (s):

Ali Najafi and Robert Tiong Lee Kong

Abstract:

Precast concrete products are generally used to shorten project duration and provide higher-quality and more sustainable construction projects. Many factors affect productivity on precast concrete construction sites, and there is a lack of research on estimation tools for predicting precast installation times for the components widely used in precast projects (walls, columns, beams and slabs). Therefore, this study examined the erection of different precast panels and developed a regression model to estimate installation times based on selected factors (extracted from the literature, interviews and site visits) involved in the stages of the installation process, namely preparation, lifting and fixing activities. The results showed that the model is appropriate for use by site managers and general estimators for planning purposes. This study contributes to construction management knowledge by providing simple but effective models to predict the installation times of precast elements. Significant factors involved in each stage of precast installation are discussed, and limitations and recommendations for future research are presented.
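The regression step can be sketched as ordinary least squares on a small made-up dataset; the factor choice (panel weight, hoisting height) and the numbers are illustrative only, not the study's measurements:

```python
# Ordinary least-squares fit of installation time against two illustrative
# factors. Solves the normal equations (X'X) beta = X'y directly.

def ols(X, y):
    """Return OLS coefficients via Gaussian elimination on the normal equations."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * v for r, v in zip(X, y)) for i in range(k)]
    A = [row + [rhs] for row, rhs in zip(XtX, Xty)]   # augmented matrix
    for c in range(k):                                # forward elimination
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]                       # partial pivoting
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [x - f * v for x, v in zip(A[r], A[c])]
    beta = [0.0] * k
    for c in reversed(range(k)):                      # back substitution
        beta[c] = (A[c][k] - sum(A[c][j] * beta[j]
                                 for j in range(c + 1, k))) / A[c][c]
    return beta

# rows: [1 (intercept), panel weight in tonnes, hoisting height in m]
X = [[1, 2.0, 6], [1, 3.5, 6], [1, 5.0, 9], [1, 6.5, 9], [1, 8.0, 12]]
y = [18.0, 24.0, 33.0, 39.0, 48.0]        # minutes to install (made up)
b0, b1, b2 = ols(X, y)
# predicted time = b0 + b1 * weight + b2 * height
```

A real model would add the preparation, lift and fixing factors as further columns of `X` and report significance tests on each coefficient.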

   

Full Text

 

Title:

A study on the efficiency of high voltage PTC heaters for electric vehicles

Author (s):

J. W. Jeong and Y. L. Lee

Abstract:

The development of electric vehicles has attracted significant international attention due to increasing environmental problems, and the mileage of electric vehicles has therefore become an important concern. However, air-conditioning systems have adverse effects on the mileage of electric vehicles. This study conducted a numerical analysis for the optimization of the efficiency of a PTC heater fin and verified the thermal efficiency of the PTC heater through numerical analyses and experiments. The results showed that a fin pitch of 1.3 mm yielded the best efficiency, and the PTC heater maintained a minimum efficiency of 93% over the considered ambient temperature range.

   

Full Text

 

Title:

An agent based simulation study of association amongst contestants in crowdsourcing software development through preferential attachment

Author (s):

Nitasha Hasteer, Abhay Bansal and B. K. Murthy

Abstract:

Software development is creative, challenging and ever evolving. With the increasing deployment of cloud technologies and the benefits of crowdsourcing, software crowdsourcing is an emerging form of software development. Members of the crowd use various platforms to participate in competitions of software design and development to earn reputation and rewards. In this paper we analyze and model the association amongst contestants on a software crowdsourcing platform. Agent-based modelling is used to simulate the actions of agents (contestants) and measure the resulting system behaviour and outcomes over time. We model the preferential attachment behavior amongst the contestants and analyze data retrieved from a crowdsourced software platform. This research proposes that agents that compete together for a certain task are more likely to be associated with each other in future competitions.
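The preferential-attachment mechanism being modelled can be sketched in a few lines: each new contestant links to an existing one with probability proportional to that contestant's current degree (Barabási-Albert style; the parameters are assumed, not the paper's calibrated model):

```python
import random

# Preferential attachment: maintaining a list of "stubs" in which each
# node appears once per unit of degree makes a uniform draw from the list
# a degree-proportional choice of attachment target.
random.seed(1)

def preferential_attachment(n_agents):
    degree = {0: 1, 1: 1}            # seed pair of contestants, one link
    stubs = [0, 1]                   # each occurrence = one unit of degree
    for new in range(2, n_agents):
        target = random.choice(stubs)        # degree-proportional choice
        degree[new] = 1
        degree[target] += 1
        stubs += [new, target]
    return degree

deg = preferential_attachment(500)
# early contestants accumulate far more links than the average newcomer
```

The resulting degree distribution is heavy-tailed, matching the intuition that a few prolific contestants co-compete with many others.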

   

Full Text

 

Title:

Numerical simulation of film cooling over flat plate

Author (s):

Ali S. Baher Eneel and Abdulhafid M. Elfaghi

Abstract:

The effect of film cooling over a flat plate is investigated using the commercial CFD code Fluent 6.3. The computational domain includes the coolant supply tube as well as the main mixing region. A tube L/D of 4 and injection angles of 30°, 60° and 90° were employed for blowing ratios of 0.33, 0.5 and 1.67, and a density ratio of 1.14. Adiabatic film cooling effectiveness distributions were also determined for inline and staggered arrangements. The main observation from this study is that the 30° hole gave larger effectiveness values than the 60° and 90° holes at the blowing ratio of 0.33 with the same length-to-diameter ratio. The maximum effectiveness was achieved with a blowing ratio of 0.5. The results show that increasing the blowing ratio negatively affects film cooling: for the blowing ratio of 1.67 the injected coolant tends to lift off from the wall due to the increase in wall-normal momentum. Comparisons of the numerical results with experimental data are presented.
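For reference, the non-dimensional groups the abstract reports results against are the adiabatic film-cooling effectiveness, the blowing ratio and the density ratio; these are the standard textbook definitions, not reproduced from the paper:

```latex
\eta = \frac{T_{\infty} - T_{aw}}{T_{\infty} - T_{c}}, \qquad
M = \frac{\rho_{c}\, u_{c}}{\rho_{\infty}\, u_{\infty}}, \qquad
DR = \frac{\rho_{c}}{\rho_{\infty}}
```

Here T_aw is the adiabatic wall temperature, T_c the coolant temperature and T_∞ the mainstream temperature; η approaches 1 where the wall is fully blanketed by coolant, which is why coolant lift-off at M = 1.67 degrades effectiveness.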

   

Full Text

 

Title:

Analysis of dense and sparse patterns to improve mining efficiency

Author (s):

A. Veeramuthu

Abstract:

Generally, data mining is used to gather information from various data repositories. Frequent pattern mining is designed to reveal repetitions in a transactional database, where patterns follow a predefined format. In this work, the transactional database is mined using a combination of recommendation and prediction, with the help of software simulation. On the hardware side, pattern mining is performed using a systolic tree: the systolic tree structure is generated while configuring the frequent patterns. However, this approach can handle only a limited dataset size, and it also generates a large candidate item set when itemset matching is implemented by the tree projection algorithm. This occupies more and more memory, and each re-evaluation on the hardware side is done from scratch over the dataset, requiring more processing time. To overcome this problem, the HI-Growth tree technique is implemented on the software side to analyze large-scale datasets, and a new recommendation-based approach is introduced to avoid candidate set generation. The method reduces internal memory use and mining time by dividing the frequent patterns into dense and sparse patterns. This paper shows that the mining speed of the HI-Growth tree is faster than the original software-side FP-Growth algorithm, and that it consumes less memory when analyzing dense and sparse patterns through the recommendation technique, improving mining efficiency while achieving higher throughput and overcoming the shortcomings of the hardware approach.
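For readers unfamiliar with the terms, frequent-pattern mining computes the support of itemsets over transactions. The sketch below does plain support counting (not the paper's HI-Growth or systolic-tree structures) on made-up transactions:

```python
from itertools import combinations
from collections import Counter

# Plain support counting for itemsets of size 1 and 2. An itemset is
# "frequent" when its count meets the minimum support threshold; below
# the threshold it would be pruned as a sparse pattern.
transactions = [
    {"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"},
]
min_support = 3

counts = Counter()
for t in transactions:
    for k in (1, 2):
        for itemset in combinations(sorted(t), k):
            counts[itemset] += 1

frequent = {s: c for s, c in counts.items() if c >= min_support}
# a dense pattern like ('a', 'b') appears in most transactions; a sparse
# one falls below the support threshold and is discarded
```

Tree-based miners such as FP-Growth compute the same support values without enumerating candidates, which is exactly the cost the candidate-free recommendation approach in the paper aims to avoid.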

   

Full Text

 

Title:

An approach for software security using digital rights management

Author (s):

R. Senthilkumar and Arunkumar Thangavelu

Abstract:

Providing proper security for software products is hard work. This paper examines how unauthorized access to software can be prevented through knowledge-based security using Digital Rights Management (DRM). The DRM is embedded with the source code, which shields it from piracy. Knowledge-based DRM and biometric-based DRM are the two methods used in the security procedure. In the proposed framework, knowledge-based security is provided in the software by using DRM. The DRM application is built around a set of questions, and the concept works on the corresponding answers holding the required authoritative information. This information is collected from the customer during the software sale: a set of general questions is created and saved in the source code, which is embedded with the application software, while the collection of knowledge-based answers is stored in a database and bound to it using DRM. When the user wants to use the software, it asks the questions, and the given answers are matched against the database. If the answers match the database, the user is allowed to use the software; if they mismatch, use of the software is not permitted. This approach gives the software better security.

   

Full Text

 

Title:

An improved motion estimation search algorithm for H.264/AVC standard

Author (s):

Vani Rajamanickam, Sangeetha Marikkannan and Sharmila Ganesan

Abstract:

The vital role of science and technology in modern life demands compression in multimedia applications, as they involve transferring large amounts of data. Motion estimation is one of the most important and complex blocks of all existing video coding standards. In the H.264/AVC video coding standard, motion estimation is allowed to search multiple reference frames, and the ME process is much more complex due to variable block sizes with quarter-pixel accuracy. Therefore, an efficient motion estimation algorithm is required to reduce the computational complexity. The improved search algorithm is suitable for stationary, quasi-stationary and fast-moving video sequences and is computationally less complex. The results show that the proposed algorithm requires very few search points to find the best matched block, with almost negligible loss in video quality. Compared with the existing ME algorithm, the simulated results of the proposed algorithm achieved an average of 11.145 search computations in less time and an average PSNR of 23.41 dB at a frame rate of 15 fps.
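The baseline such improved algorithms are measured against is full-search block matching with a sum-of-absolute-differences (SAD) cost. A tiny sketch on made-up frames (lists of lists standing in for grayscale images):

```python
# Full-search block matching: for a block in the current frame, try every
# displacement (dx, dy) in a search window of the reference frame and keep
# the one with the smallest sum of absolute differences.

def sad(cur, ref, bx, by, dx, dy, bs):
    return sum(
        abs(cur[by + r][bx + c] - ref[by + dy + r][bx + dx + c])
        for r in range(bs) for c in range(bs)
    )

def best_vector(cur, ref, bx, by, bs=2, search=2):
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if (0 <= by + dy and by + dy + bs <= h and
                    0 <= bx + dx and bx + dx + bs <= w):
                cost = sad(cur, ref, bx, by, dx, dy, bs)
                if best is None or cost < best[0]:
                    best = (cost, dx, dy)
    return best[1], best[2]

# reference frame with a bright 2x2 block at (1,1); in the current frame
# the same block sits at (2,2), i.e. it moved one pixel right and down
ref = [[0] * 6 for _ in range(6)]
ref[1][1] = ref[1][2] = ref[2][1] = ref[2][2] = 9
cur = [[0] * 6 for _ in range(6)]
cur[2][2] = cur[2][3] = cur[3][2] = cur[3][3] = 9
print(best_vector(cur, ref, 2, 2))   # → (-1, -1), pointing back to (1, 1)
```

Fast algorithms keep the same SAD cost but visit only a fraction of the (dx, dy) candidates, which is where the reported reduction in search points comes from.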

   

Full Text

 

Title:

Evaluation of work postures - the associated risk analysis and the impact on labor productivity

Author (s):

Chowdury M. L. Rahman, Syed Misbah Uddin, M. A. Karim and Mohiuddin Ahmed

Abstract:

Various musculoskeletal disorder (MSD) symptoms can be experienced by workers performing their tasks in bad work postures, which are largely static and consequently associated with long-term risks and injuries. These postures also have a bad impact on work performance and labor productivity. In this regard, a case study was conducted in a selected ceramic factory of Bangladesh with the aim of evaluating the work postures of workers in the production section of the factory through rapid upper limb assessment (RULA) and their impact on labor productivity. The secondary objective was to analyze the risks associated with bad work postures. The RULA technique was used to evaluate the work postures of the workers. RULA is a widely used tool developed for the assessment of work postures which specifically examines the level of risk associated with upper limb disorders by scoring the different body regions of individual workers. The results are presented in three main sections: the identification of good or bad work postures, the level of risk associated with poor work postures, and their impact on labor productivity. It was identified that most workers are exposed to upper limb discomfort, which in turn contributes to the risk of injury during work. The RULA analysis revealed that no posture was found risk-free during the investigation of workers' postures. With a RULA grand score of 7, 43.59% of the workers need immediate investigation and changes, indicating that the level of exposure to postural risk is very high and that immediate ergonomic intervention to decrease this exposure seems essential. The consequence of bad work posture is musculoskeletal disorders (MSDs), which have also been analyzed in this research work.
The most commonly affected body regions found among the 39 listed workers are shoulders (92.31%), neck (71.79%), wrist (71.31%), lower back (43.59%) and upper back (41.03%). Lastly, the correlation between RULA grand score and labor productivity has been shown. The graphical analysis reveals that there is a decreasing trend of labor productivity with the higher RULA grand score establishing the fact that there exists an inverse relationship between average RULA grand score and average labor productivity.
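The RULA grand-score bands referred to above (a grand score of 7 requiring immediate investigation and change) map to the standard RULA action levels, which can be written as:

```python
# Mapping a RULA grand score (1-7) to its published action level and the
# usual interpretation of each band.

def rula_action(grand_score):
    if grand_score <= 2:
        return 1, "posture acceptable if not maintained for long periods"
    if grand_score <= 4:
        return 2, "further investigation; changes may be needed"
    if grand_score <= 6:
        return 3, "investigation and changes required soon"
    return 4, "investigation and changes required immediately"

level, advice = rula_action(7)
print(level)   # → 4
```

This is why the 43.59% of workers scoring 7 fall into the highest action level and call for immediate ergonomic intervention.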

   

Full Text

 

Title:

A framework for security in data mining using intelligent agents

Author (s):

Sharath Kumar. J and Maheswari. N

Abstract:

Nowadays it is possible to outsource the data mining needs of a corporation to a third party. An establishment in the corporate world without sufficient computational resources or expertise can outsource its mining needs. But the data, as well as the association rules defined over it, are the sole property of the company, and thus privacy and security need to be preserved. Also, partitioned databases are capable of simplifying the complexity of massive data as well as improving the overall performance of the system. In this paper we devise a scheme that ensures the privacy of data, incorporating database partitioning to provide a highly efficient and secure system with a new algorithm for privacy preservation, a combination of l-diversity and p-sensitive technology. Agent technology is also introduced into the system: different agents are used for different tasks, such as a mining agent, data agent, task agent and user agent, and they communicate with each other and work together to provide a heuristic solution.

   

Full Text

 

Title:

A novel prediction model for academic emotional progression of graduates

Author (s):

Venkatramaphanikumar S, K Prudhvi Raj, D S Bhupal Naik and K V Krishna Kishore

Abstract:

According to the present-day necessity for universities to anticipate the placement and career opportunities of their students, there is a need for better assessment and prediction tools based on the various dimensions of the student. In this regard, a Multilayer Perceptron (MLP) based prediction model is suggested to predict the job opportunities of undergraduate students by considering the student's academic aspects, such as major, discipline, working nature, academic history (X and pre-university), regularity, number of failed courses, degree of intelligence and current GPA; co-curricular aspects, such as project accomplishment, certification courses, workshops and presentations, pre-placement training attendance, pre-placement test performance and communication skills; behavioral aspects, such as introversion, extroversion and teamwork attitude; and other aspects, such as family background and career objective. This system is designed to improve the accuracy of performance prediction for students who have a low probability of getting a job, so that the outcome can be used to take proactive measures such as additional training classes and remedial counseling to enhance their probability of placement. To evaluate the performance of the proposed model, data were collected voluntarily from 153 final-year engineering graduate students of Vignan's University, India. The prediction accuracy of the Multilayer Perceptron outperforms other classification methods.

   

Full Text

 

Title:

Abundance of Thrips palmi Karny and the phenomenon of Thrips sp. (Thysanoptera: Thripidae) attack as pest and virus vector at vegetable plantations in Jambi region

Author (s):

Asni Johari

Abstract:

Thrips sp. are insect pests that damage a variety of crops, including vegetables. Thrips attacks range from mild to severe, and are more severe when the thrips act as virus vectors. One species with potential as a viral vector is Thrips palmi Karny. The abundance of the thrips population also influences the level of attack. In Jambi Province there have been no reports on the abundance of Thrips palmi Karny or on the phenomenon of Thrips sp. attack on vegetable crops. The study aimed to analyze the abundance of Thrips palmi Karny and the phenomenon of Thrips sp. attack on vegetable crops. The study was conducted by a survey of vegetable crops in lowland and highland regions of Jambi. Thrips were collected from a variety of vegetable crops at each location, preserved in 70% alcohol, and then mounted on microscopic slides to identify the thrips species. The abundance of Thrips palmi Karny in each sample was then analyzed. Observations of the thrips attack phenomenon were carried out on 50 pepper plants in a thrips-infested cage, with morphological and chemical analyses and an ELISA test on the affected leaves. Thrips palmi Karny had the highest abundance, in both lowland and highland, on Solanum melongena, followed by Cucumis sativus. In captivity, thrips attacks occurred on the upper leaf surface, at the base, the middle and the edges of the leaves, with silvery damage. Thrips attacks on chili leaves lowered the chlorophyll content and damaged the leaf cell structures. Levels of nitrogen, fat and carbohydrate in thrips-attacked chili leaves were not significantly different from control leaves at the 5% level. ELISA test results showed that thrips-attacked leaves obtained from vegetable plantations did not contain Tospovirus.

   

Full Text

 

Title:

A combined face recognition approach based on LPD and LVP

Author (s):

Kabilan R, Ravi R, Rajakumar G, Esther Leethiya Rani S and Mini Minar V C

Abstract:

Face recognition is mainly used to identify a person by comparing facial features, and several techniques are used to extract those features. In this paper a novel local pattern descriptor, the local vector pattern (LVP), is used to extract the features. The LVP extracts features in high-order derivative space for face recognition, and is mainly used to reduce high redundancy and the feature-length-increasing problem. The feature-length problem is solved by a comparative space transform, which encodes various spatial surrounding relationships between the referenced pixel and its surrounding pixels. The linking of LVPs is compacted to produce more distinctive features and reduce redundancy. The LVP extracts micro-patterns encoded through the pairwise directions of vectors using an effective coding scheme called the Comparative Space Transform (CST) to extract distinctive information. Histogram intersection is used to evaluate the similarity between the spatial histograms of two distributions extracted from the LVP and to recognize the face image.
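The histogram-intersection matching step mentioned at the end can be sketched directly; the two example histograms are made up, standing in for real LVP spatial histograms:

```python
# Histogram intersection similarity: sum of bin-wise minima, here
# normalized by the total mass of the second histogram so that comparing
# a histogram with itself yields 1.0.

def histogram_intersection(h1, h2):
    return sum(min(a, b) for a, b in zip(h1, h2)) / sum(h2)

probe   = [4, 0, 3, 1, 2]      # made-up LVP histogram of a probe face
gallery = [3, 1, 3, 0, 3]      # made-up histogram of a gallery face
print(round(histogram_intersection(probe, gallery), 2))   # → 0.8
```

Recognition then amounts to picking the gallery identity whose histogram gives the highest intersection with the probe.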

   

Full Text

 

Title:

A survey on collaborative filtering techniques and QoS-based recommendation systems

Author (s):

N. Kannammal, S. Vijayan and R. Sathishkumar

Abstract:

The immense growth in internet technologies has led to an increase in the size of web service repositories. Service requestors expect highly qualitative web services in response to their requests, but may have no previous knowledge of the requested domain, so it is difficult for them to filter the relevant web service out of a huge pool of data. Moreover, returning irrelevant services for a user request affects user satisfaction. Recommender systems are widely used to recommend products or items to consumers; such a system can also recommend a service, or a list of services, to a service requestor. Collaborative filtering (CF) is one of the efficient recommendation techniques: it recommends a service based on past users' experiences with, or ratings of, that service, where the past users are the nearest neighbors of the requestor. Traditional CF performs user-based and item-based similarity computation between users and items for recommendation, but does not take into account the non-functional components (QoS parameters) of the service, which greatly impact performance. This paper reviews the CF technique and the need for QoS parameters in recommendation systems to improve performance.
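A minimal sketch of the user-based CF prediction described (cosine similarity over co-rated services, then a similarity-weighted average of neighbors' ratings); the ratings matrix is invented for illustration:

```python
import math

# User-based collaborative filtering: predict a requestor's rating for a
# service from the ratings of similar past users.
ratings = {                      # user -> {service: rating}
    "u1": {"s1": 5, "s2": 3, "s3": 4},
    "u2": {"s1": 4, "s2": 2, "s3": 5},
    "u3": {"s1": 1, "s2": 5},
}

def cosine(a, b):
    """Cosine similarity restricted to the services both users rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[s] * b[s] for s in common)
    den = math.sqrt(sum(a[s] ** 2 for s in common)) * \
          math.sqrt(sum(b[s] ** 2 for s in common))
    return num / den

def predict(user, service):
    """Similarity-weighted average of neighbors' ratings for the service."""
    pairs = [(cosine(ratings[user], r), r[service])
             for u, r in ratings.items() if u != user and service in r]
    wsum = sum(w for w, _ in pairs)
    return sum(w * v for w, v in pairs) / wsum if wsum else None

# predict u3's rating for s3 from the similar users u1 and u2
print(round(predict("u3", "s3"), 2))
```

A QoS-aware variant would replace the subjective ratings with observed QoS values (response time, throughput) per user-service pair, keeping the same neighborhood computation.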

   

Full Text

 

Title:

Actualizing XML clustering using web search engine

Author (s):

Vaishnavi. S and Nimala. K

Abstract:

Searching is an extremely tedious process because users keep giving different keywords to the web crawler until they arrive at the best results, and no clustering approach is applied in the existing system. Feature selection involves identifying a subset of the most helpful features that produces results as good as the original whole set of features. The FAST clustering algorithm works in two steps. In the first step, features are separated into groups using graph-theoretic clustering techniques. In the second step, the most representative feature that is strongly related to the target classes is chosen from each group to form a subset of features. XML-based group formation is adopted for space and language efficiency: information can be transferred in any database format that can be converted into XML, and this is used to remove irrelevant and unwanted features. Feature interaction is essential in real-world applications, so fast clustering based feature selection is implemented in this module to perform the clustering procedure. Active clustering is implemented to present the results one by one, so that the user can select a group of results. This methodology is introduced to increase the efficiency of the framework and to improve the process in machine learning and data mining.

   

Full Text

 

Title:

An efficient liver segmentation using kernel sparse coding automated (KSCA) approach

Author (s):

Rajesh Sharma R and Marikkannu P

Abstract:

Computed Tomography (CT) images have been widely used for diagnosis of liver disease and for volume measurement in liver surgery or transplantation. The approach is presented with respect to liver segmentation, but it can easily be extended to any other soft tissue by setting appropriate values of the parameters for the splitting-and-merging algorithm and for the region-growing refinement step. Sparse coding with data-adapted dictionaries has been successfully employed in several image recovery and vision problems. A novel, automated segmentation technique for detecting affected regions in the liver is proposed in this paper. In the new approach, we constructed ensemble kernel matrices using the pixel intensities and their spatial locations, and obtained kernel dictionaries for sparse coding of pixels in a non-linear feature space. The resulting sparse codes were used to train an Extreme Learning Machine (ELM) classifier that determines whether a pixel in the image belongs to an affected region. Experimental results using ten test datasets distributed for the competition confirmed that our kernel-sparse-coding-based liver segmentation method performs better than previous methods and models.

   

Full Text

 

Title:

An improved VLSI architecture using wavelet filter bank for blur image applications

Author (s):

Kabilan R, Ravi R, Jenniefer J Sherine R, Rajakumar G and Mini Minar V C

Abstract:

An effective image compression technique using the 2-D Discrete Wavelet Transform (DWT) is proposed. It has been implemented using a 6-tap Daubechies filter bank to provide reduced adder count and path delay. The proposed architecture first acquires feature points by the local binary pattern (LBP). They are then encoded by the wavelet filter bank, and blur noise is removed from the image. The algebraic integer (AI) technique provides a simple representation for the irrational basis coefficients of the transform. The compressed image is reconstructed using the inverse feature transform. The compression performance (CP), peak signal-to-noise ratio (objective) and visual quality (subjective) of the image are measured, and they are found to outperform the existing method. The proposed method can be used in medical imaging.
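The decompose-and-downsample structure of a wavelet filter bank can be sketched with the 2-tap Haar pair; the paper's architecture uses a 6-tap Daubechies bank, but only the filter taps differ, the structure is the same:

```python
import math

# One-level 1-D discrete wavelet transform with the Haar filter pair:
# low-pass (sums) and high-pass (differences) outputs, each downsampled
# by two, followed by the exact inverse transform.

def haar_dwt(signal):
    """Return (approximation, detail) coefficients; input length must be even."""
    s = 1 / math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Perfect reconstruction from one level of Haar coefficients."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_dwt(x)
assert all(abs(u - v) < 1e-9 for u, v in zip(haar_idwt(a, d), x))
```

A 2-D transform applies the same filtering along rows and then columns; compression comes from quantizing or discarding the small detail coefficients before reconstruction.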

   

Full Text

 

Title:

An optimized event based software project scheduling with uncertainty treatment

Author (s):

Sarojini Yarramsetti and G. Kousalya

Abstract:

Software organizations meet new challenges every day in the workflow of different projects. Scheduling software projects is important and challenging for software project managers: efficient project plans reduce the cost of software construction, and efficient resource allocation obtains the desired result. Task scheduling and human resource allocation have been addressed in many software models. Although there are a large number of scheduling and staffing techniques, such as Ant Colony Optimization, Particle Swarm Optimization (PSO), Genetic Algorithms (GA) and PSO-GA, there is a need to address uncertainties in requirements, process execution and resources. Many resource plans are affected by the unexpected joining and leaving of human resources, which constitutes uncertainty. We develop a prototype tool to support managing uncertainties, using simulation and simple models for management decisions about resource reallocation, and evaluate our approach on some real-world data. This paper presents a solution to the problem of uncertain events in software project planning and resource allocation, in particular the uncertainties in human resource allocation.

   

Full Text

 

Title:

Analysing the effect of interference in wireless industrial automation system (WIAS)

Author (s):

R. Nagarajan and R. Dhanasekaran

Abstract:

ZigBee is a wireless standard recommended for low-data-rate wireless personal area networks. It is widely used in many wireless monitoring and control application domains due to its low cost, low power and implementation simplicity. ZigBee can work in a non-beacon-enabled mode using un-slotted Carrier Sense Multiple Access/Collision Avoidance (CSMA/CA) or in a beacon-enabled mode using slotted CSMA/CA with or without guaranteed time slots (GTSs). GTSs can be allocated by the network coordinator to devices that require specific bandwidth reservation. Currently, many chip vendors, including MaxStream and Digi, produce commercially available products adopting the ZigBee specification. Wireless networks used in the industrial domain are expected to perform their operations smoothly under a broad range of stringent operating conditions. Sensing networks designed for such purposes must therefore consider co-channel interference; signal loss or fading due to metallic machinery and the presence of obstacles; the effects of variations in operating temperature, pressure and humidity; and the impact of noise, vibrations generated by engines, boilers and rotating machinery, airborne contaminants, etc., on the sensing and data communication ability of the network. This paper investigates interference effects in a Wireless Industrial Automation System (WIAS).

   

Full Text

 

Title:

In situ assessment of electromagnetic interference between RFID systems and medical devices

Author (s):

M. Periyasamy and R. Dhanasekaran

Abstract:

The objective of the proposed work is to conduct an in situ assessment of electromagnetic interference between radio frequency identification (RFID) systems and medical devices. Two RFID systems were considered: a passive system working at 13.56 MHz and an active system operating at 2.5 GHz. Ten medical devices, including an electrocardiogram monitor, ventilators, defibrillators and infusion pumps, were tested in accordance with the procedures specified in ANSI standard C63.18 for ad hoc on-site testing. Based on the results obtained, none of the devices was affected by the presence of the two RFID systems, except for distortion in the pulse oximeter caused by the 2.5 GHz system at very close distance (5 cm).

   

Full Text

 

Title:

Machine learning approach for medical diagnosis

Author (s):

Manickapriya. S and Nimala. K

Abstract:

To handle the clustering of dynamic data, affinity propagation (AP) clustering is used, which raises the problem of incremental AP clustering. In incremental AP clustering, newly arrived objects are clustered by adjusting the current clustering result rather than recomputing it from scratch. A message passing concept (MPC) is used for the data points to communicate with each other and produce clusters in parallel, which also supports effective error correction.
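Affinity propagation as described above (exemplars chosen by message passing, with no preset cluster count) can be sketched with scikit-learn's batch implementation; the two-blob data set is illustrative, and this is plain AP, not the incremental variant discussed in the paper:

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Two well-separated blobs; AP selects exemplars by exchanging responsibility
# and availability messages, without a preset number of clusters.
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2), rng.randn(20, 2) + 10])

ap = AffinityPropagation(random_state=0).fit(X)
print(len(ap.cluster_centers_indices_))   # clusters discovered by message passing
```

Each point is assigned the label of its exemplar, so points from the two blobs end up in different clusters.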

   

Full Text

 

Title:

Ontology based text document summarization system using concept terms

Author (s):

R. Ragunath and N. Sivaranjani

Abstract:

In the modern world, due to dramatic technological development, a huge amount of information is available everywhere, so it is difficult to grasp the main content of a document without reading it in its entirety, which takes time proportional to the amount of information in the document. Automatic summarization solves these problems. In this paper, an ontology-based text summarization system using concept terms is introduced. Concepts are extracted using a concept extraction algorithm, a hierarchical representation of the concept terms is generated using the ontology model, and the required summary is then generated by setting the concept depth.

   

Full Text

 

Title:

Performance evaluation in speed control of classic converter fed switched reluctance motor using PI controller

Author (s):

Muthulakshmi S. and Dhanasekaran R.

Abstract:

The switched reluctance motor (SRM) is used in various industrial applications due to its beneficial advantages. However, the robustness of the SRM is the main concern, which severely affects the dynamic performance of the motor. Thus the aim of this paper is to control the speed of a switched reluctance motor using a PI controller. The controller is designed and simulated in MATLAB/SIMULINK. The use of a PI controller in the outer loop gives superior performance of the motor drive: the dynamic performance of the SRM during the starting period is controlled by the PI controller under different load conditions. This paper shows the effect of load disturbance, speed variation and motor parameters such as stator winding resistance and motor inertia on the speed of the switched reluctance motor. The simulation results reveal the effectiveness of the PI controller on the motor performance.
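A minimal discrete-time sketch of the outer-loop PI speed control described above, run against a toy first-order plant; the gains, limits and plant constants are hypothetical, not the paper's SRM model:

```python
class PIController:
    """Discrete PI controller with anti-windup clamping on the output."""
    def __init__(self, kp, ki, dt, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        u = self.kp * error + self.ki * self.integral
        # clamp and back off the integrator when saturated (anti-windup)
        if u > self.u_max:
            self.integral -= (u - self.u_max) / self.ki
            u = self.u_max
        elif u < self.u_min:
            self.integral -= (u - self.u_min) / self.ki
            u = self.u_min
        return u

# Toy first-order "motor": speed approaches 1000*u with a 0.1 s time constant
pi = PIController(kp=0.002, ki=0.05, dt=0.001)
speed = 0.0
for _ in range(5000):                      # 5 s simulation, setpoint 500 rpm
    u = pi.update(500.0, speed)
    speed += (1000.0 * u - speed) * 0.001 / 0.1
print(round(speed, 1))
```

The integral term drives the steady-state error to zero, so the simulated speed settles at the 500 rpm setpoint.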

   

Full Text

 

Title:

Testing of faults in VLSI circuits using online BIST technique based on window of vectors

Author (s):

Navaneetha Velammal M., Nirmal Kumar P. and Getzie Prija A.

Abstract:

Built-In Self-Test (BIST) provides an attractive solution for testing embedded blocks and combinational circuits, as it performs testing during normal operation of the circuit. There are several BIST schemes, and the main parameters are hardware overhead and time consumption. In the existing technique, a RAM module is used to store the test vectors; the hardware overhead is high because the size of the RAM grows in proportion to the number of input vectors. To overcome this limitation, the RAM module is replaced by the proposed module, and to reduce time consumption a window of vectors is used. The proposed method thus uses an online BIST technique based on a window of vectors that performs testing during the normal operation of the circuit.

   

Full Text

 

Title:

Identification of relevant documents considering unlabelled documents

Author (s):

Subin. V. B and Sivaranjani. N

Abstract:

Active learning tackles the data scarcity problem by choosing unlabelled data for labeling and training, and it handles the selection of large volumes of data that are diverse in character. A problem remains in handling unlabelled data against certain predefined categories. This can be overcome by developing a flexible method in which large volumes of content-diverse data are learned through a single platform, as groups of items rather than individually. Performance analysis using data mining approaches validates accuracy and F-measure, which combines precision and recall over the query-relevant data that are successfully retrieved, and the efficiency of active learning, leading to reliable and authentic predictions.

   

Full Text

 

Title:

A comprehensive analysis in PID tuning with soft computation in paper industry

Author (s):

M. Senthil Kumar and K. Mahadevan

Abstract:

In this paper, the Particle Swarm Optimization (PSO) algorithm is applied to a moisture control system for auto-tuning the PID parameters. The Proportional-Integral-Derivative control scheme is widely used because it is efficient and quite easy to apply in control engineering applications; however, most PID tuning is performed manually, which is difficult and time consuming. The PSO algorithm improves the efficiency of the tuning process. The proposed algorithm is used to tune the PID parameters, and its performance is compared with fuzzy logic techniques: compared to the fuzzy logic technique, PSO produces optimal values of dynamic performance specifications such as rise time, peak time and peak overshoot. The plant model, represented by a transfer function, is obtained using the system identification toolbox.
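The PSO-based tuning loop described above can be sketched end to end: particles encode controller gains, and the fitness is the integral-squared-error (ISE) of a simulated closed loop. The plant, gain bounds and PSO coefficients below are illustrative, not the paper's moisture-control model, and a PI loop stands in for the full PID:

```python
import random

def ise(kp, ki, dt=0.01, t_end=5.0):
    """Integral-squared-error of a PI loop on a toy first-order plant."""
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y                  # unit-step setpoint
        integ += e * dt
        u = kp * e + ki * integ
        y += (u - y) * dt / 0.5      # plant: gain 1, time constant 0.5 s
        cost += e * e * dt
    return cost

def pso(fitness, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO over box-constrained real parameters."""
    rng = random.Random(1)
    dim = len(bounds)
    pos = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            f = fitness(*pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

(kp, ki), cost = pso(ise, bounds=[(0.0, 20.0), (0.0, 20.0)])
print(round(cost, 3))   # tuned loop: ISE close to zero
```

Swapping `ise` for a cost that also penalizes overshoot or settling time tunes for those specifications instead.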

   

Full Text

 

Title:

A fuzzy logic based energy management system for a micro grid

Author (s):

S. D. Saranya, S. Sathyamoorthi and  R. Gandhiraj

Abstract:

This paper proposes an approach to battery management for stand-alone hybrid solar photovoltaic and wind power systems. The battery charging process is non-linear and time-varying with a considerable time delay, so it is difficult to achieve the best energy management performance using traditional control approaches. A fuzzy control strategy for battery charging and discharging in a renewable power generation system is analyzed in this paper. To improve the life cycle of the battery, the fuzzy controller manages the desired state of charge (SOC). The fuzzy logic-based controller for battery SOC control of the designed hybrid system is compared with a classical PI controller for performance validation. The entire system is modelled and simulated in the MATLAB/Simulink environment.

   

Full Text

 

Title:

A reliable vector control method: IFOC for three phase induction motor drives using SVPWM

Author (s):

M. B. Joseph Gerald and K. Mahadevan

Abstract:

Vector control of AC drives has been broadly used in high-performance control systems. Indirect field oriented control (IFOC) is one of the most efficient vector control schemes for the induction motor due to its simplicity of design and construction. This paper presents the performance of three-phase induction motors under a space vector pulse width modulation (SVPWM) scheme. The SVPWM system incorporates two control loops: an inner current control loop and an outer speed control loop using a PID controller. Both systems were run and tested using MATLAB/SIMULINK software. The simulation results demonstrate that SVPWM can improve the quality of the stator current and reduce the torque ripple while keeping the other performance characteristics of the system.

   

Full Text

 

Title:

A review on data privacy protection and types of attacks in cloud computing

Author (s):

Satheeshkumar R. and  Kannamal N.

Abstract:

In recent years, many organizations have adapted their systems to cloud-based computing to provide scalable, virtualized, on-demand access to a shared pool of computing resources such as networks, servers, storage, applications and services. Cloud computing technology enables users and enterprises to eliminate the requirement of setting up expensive computing infrastructure and reduces systems' operating costs, so it is used by a large number of end users. On the other hand, security deficiencies and vulnerabilities of the underlying technologies can leave an open door for intrusions. Therefore, cloud computing providers need to protect their users' sensitive data from insider and outsider attacks by installing an intrusion detection and prevention system. This paper defines the different attack types that affect the availability, confidentiality and integrity of resources and services in the cloud computing environment, and additionally introduces related intrusion detection models to identify and prevent these types of attacks.

   

Full Text

 

Title:

A survey on liver tumor detection and segmentation methods

Author (s):

R. Rajagopal and P. Subbaiah

Abstract:

Liver tumor is a pathological disorder that affects around 50 million people worldwide. Early detection and diagnosis of liver tumors is important for their prevention. Many techniques have been developed for the detection of liver tumors using abnormal lesion size and shape. This paper reviews various liver tumor detection algorithms and methodologies used for liver tumor diagnosis. A novel methodology for the detection and diagnosis of liver tumors is also proposed, and its experimental results are compared with those of the various existing methodologies.

   

Full Text

 

Title:

A very short term wind power forecasting using back-propagation algorithm in neural networks

Author (s):

Priyadarshni S., Booma J., Dhanarega A.J. and Dhanalakshmi P.

Abstract:

This paper presents an application of the Artificial Neural Network with Back-Propagation (ANN-BP) to wind power forecasting. The need for accurate forecasting keeps increasing as power demands grow and power markets become more competitive and complex in integrating wind power into the grid. This paper presents a wind power forecasting model for very-short-term scheduling.
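A minimal sketch of back-propagation forecasting in the spirit described above: a one-hidden-layer network trained by gradient descent to predict the next sample of a synthetic "wind power" series from its four previous samples. The data, network size and hyperparameters are all illustrative:

```python
import numpy as np

rng = np.random.RandomState(0)
# Toy "wind power" series: smooth periodic signal plus noise, scaled to [0, 1]
t = np.arange(400)
series = 0.5 + 0.4 * np.sin(2 * np.pi * t / 50) + 0.02 * rng.randn(400)

# Supervised pairs: previous 4 samples -> next sample (very-short-term forecast)
lag = 4
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:].reshape(-1, 1)

# One hidden layer, trained by full-batch gradient-descent back-propagation
W1 = rng.randn(lag, 8) * 0.5; b1 = np.zeros(8)
W2 = rng.randn(8, 1) * 0.5;  b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - y                        # backward pass (MSE gradient)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)      # gradient through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = float(np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
print(round(rmse, 3))   # forecast error approaches the noise floor
```

In practice the inputs would be historical wind power or speed measurements, and the trained network would be evaluated on a held-out horizon rather than the training window.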

   

Full Text

 

Title:

An effective mitosis recognition and segmentation tool for stem cell exploration

Author (s):

R. Nathiya and G. Sivaradj E.

Abstract:

Stem cells have enormous potential and impact in regenerative medicine, leading to rapidly growing interest in tools to analyze and characterize the behavior of these cells in vitro in an automated and high-throughput fashion. Measurement of the proliferative behavior of cells in vitro is important to many biomedical applications and requires accurate counting and localization of occurrences of mitosis, or cell division, in a cell culture. In this paper, a performance analysis of clustering for segmenting mitosis detections is proposed. It is possible to manually identify incidents of mitosis because mitotic cells in culture tend to exhibit intensified surrounding halos under phase contrast illumination; this halo artifact is eliminated using a diffusion corona filter. Using this method, a segmentation precision of 97.1% is obtained, which is 1.3% higher than that of the semi-Markov process segmentation.

   

Full Text

 

Title:

Analysis of 16-bit carry look ahead adder – A subthreshold leakage power perspective

Author (s):

Amuthavalli G. and Gunasundari R.

Abstract:

Power is an inevitable curb on digital design in emerging technologies. The down-scaling of transistor geometry paves the way for the curtailment of power consumption, but of all the leakage components, subthreshold leakage current is the major contributor to static power dissipation. The subthreshold leakage power is analyzed in a conventional 16-bit Carry Look Ahead Adder (CLA) circuit. The paper proposes a novel Short Pulse Power Gated Approach (SPOGA), a leakage power reduction technique implemented particularly for low-duty-cycle applications (for example, wireless sensor networks and burst-mode systems). The power consumption of the circuit is interpreted from transient analysis using 90 nm technology in the Cadence GPDK.
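The dominance of subthreshold leakage noted above follows from its exponential dependence on gate drive; a sketch of the classic model I_sub = I0·exp((Vgs − Vth)/(n·VT))·(1 − exp(−Vds/VT)), with illustrative parameter values rather than 90 nm GPDK data:

```python
import math

def subthreshold_current(vgs, vth, vds, i0=1e-7, n=1.4, temp_k=300.0):
    """Classic subthreshold leakage model; parameter values are illustrative."""
    vt = 1.380649e-23 * temp_k / 1.602176634e-19   # thermal voltage kT/q, ~25.9 mV
    return i0 * math.exp((vgs - vth) / (n * vt)) * (1.0 - math.exp(-vds / vt))

# Leakage of an "off" transistor (Vgs = 0) for two threshold voltages:
low_vth = subthreshold_current(0.0, 0.30, 1.0)
high_vth = subthreshold_current(0.0, 0.40, 1.0)
# The subthreshold slope here is n*VT*ln(10), about 83 mV/decade, so raising
# Vth by 100 mV cuts the leakage by slightly more than one decade (~16x).
print(round(low_vth / high_vth, 1))
```

This exponential sensitivity is why power-gating schemes such as the one proposed, which effectively raise the threshold or cut the leakage path during idle periods, pay off so strongly at low duty cycles.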

   

Full Text

 

Title:

Analysis of carbon nanostructures for on-chip interconnect application

Author (s):

P. Murugeswari, A. P. Kabilan,  S. Rohini  and P. Pavithra

Abstract:

This paper analyses carbon nanostructures, particularly the carbon nanotube (CNT) and the graphene nanoribbon (GNR), whose excellent electrical, thermal and mechanical properties make them an emerging alternative for future on-chip interconnect applications. The analysis of CNT and GNR as on-chip interconnects has been performed with the help of existing equivalent circuit models, considering performance metrics such as delay, bandwidth and power-delay product (PDP). The performance of carbon nanostructures (CNT and GNR) is better than that of Cu interconnect at all interconnect levels, even when the technology scales below 22 nm: the single-layer GNR and single-walled CNT exhibit only 0.5% and 0.7%, respectively, of the delay observed in copper interconnects. An extreme reduction in power dissipation is also justified by the results. Thus Moore's law is obeyed even when technology scales into tens of nanometers.

   

Full Text

 

Title:

Booth recoded Wallace tree multiplier using NAND based digitally controlled delay lines

Author (s):

B. Kayalvizhi, N. Anies Fathima and T. Kavitha

Abstract:

A digitally controlled delay line (DCDL) is a digital circuit that provides a desired delay controlled by a digital control word. A wide variety of approaches is available for constructing DCDLs, and previous work deals with designing DCDLs with and without glitches. Glitches are the most significant factor limiting the use of DCDLs in many applications; the glitching of a circuit can be analyzed by increasing the delay control code, and by reducing the number of glitches the delay line is further reduced. In this paper, a NAND-based DCDL is improved using a Wallace tree multiplier, which gives an accurate value as well as an increased speed of operation. The aim is an additional reduction of the latency and area of the Wallace tree multiplier using delay control units based on the DCDL unit. The simulations have been carried out using ModelSim and Xilinx tools.
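The Booth recoding named in the title halves the number of partial products the Wallace tree must sum. A behavioral sketch of radix-4 (modified) Booth recoding, purely illustrative of the arithmetic rather than of the DCDL hardware:

```python
def booth_radix4_digits(multiplier, bits):
    """Radix-4 Booth recoding: overlapping 3-bit windows -> digits in {-2..2}."""
    if bits % 2:            # need an even bit width for radix-4 windows
        bits += 1
    m = multiplier & ((1 << bits) - 1)   # two's-complement view of the multiplier
    padded = m << 1                      # implicit 0 below the LSB
    table = {0b000: 0, 0b001: 1, 0b010: 1, 0b011: 2,
             0b100: -2, 0b101: -1, 0b110: -1, 0b111: 0}
    # digit k inspects bits (2k+1, 2k, 2k-1) and carries weight 4**k
    return [table[(padded >> i) & 0b111] for i in range(0, bits, 2)]

def booth_multiply(a, b, bits=8):
    """Multiply via Booth digits: half as many partial products as bit-by-bit."""
    return sum(d * a * (4 ** k) for k, d in enumerate(booth_radix4_digits(b, bits)))

print(booth_multiply(13, 11))   # 143
print(booth_multiply(13, -7))   # -91
```

In hardware, each nonzero digit selects a shifted copy of the multiplicand (possibly negated or doubled), and the Wallace tree compresses these partial products with carry-save adders.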

   

Full Text

 

Title:

Coagulation performance evaluation of natural and synthetic coagulants in waste water treatment

Author (s):

M. Senthil Kumar, G. V. T. Gopala Krishna and V. Sivasankar

Abstract:

Current treatment methods use various chemicals that synthesize significant byproducts, which may pollute the environment and slowly deteriorate the ecosystem. Our study uses various natural and synthetic materials for the coagulation process, which is eco-friendly and efficient. This study investigates the coagulation performance of natural and synthetic materials in removing suspended particles from waste water. The removal of suspended particles as a function of time, dose and initial turbidity is explored. Turbidity removal followed by the removal of Total Dissolved Solids (TDS) has been envisaged, with removal of TDS by continuous-flow column techniques using TiO2-mixed sand. Effluents from the textile industry and sewage water are to be treated by the above removal techniques. The synthetic coagulant showed the higher coagulation efficiency and can be used for treating higher-turbidity effluents.

   

Full Text

 

Title:

Coupled Inductor Based  DC-DC Converter for High Step-Up Application

Author (s):

K. Radha Lakshmi  and R. Dhanasekaran

Abstract:

In this paper, a coupled-inductor-based DC-DC converter for high step-up applications is proposed. The concept is to utilize two capacitors and one coupled inductor: the two capacitors are charged in parallel during the switch-off period and discharged in series during the switch-on period by the energy stored in the coupled inductor, achieving a high step-up voltage gain. In addition, because the energy stored in the coupled inductor is recycled, the voltage stress on the main switch is reduced, a switch with low on-resistance RDS(ON) can be adopted to reduce the conduction loss, and the reverse-recovery problem of the diodes is alleviated. Lower turns ratios yield not only lower conduction losses but also higher power-conversion efficiency. The operating principle and steady-state analysis are discussed in detail. Finally, the simulation of a 200 W converter operating at 50 kHz with 12 V input and 120 V output is presented to demonstrate the performance; the results are verified in MATLAB.

   

Full Text

 

Title:

Design and simulation of voltage booster circuit using coupled inductor

Author (s):

P. Muthukrishnan and R. Dhanasekaran

Abstract:

In this paper, the design and simulation of a high-voltage-gain DC-DC converter using a coupled inductor are proposed. The duty ratio of the proposed converter is 0.65, an appropriate value for this converter design. Using a large number of switches in the converter circuit would increase the switching power losses, so this converter uses only two switches and has low voltage stress across the power switches. The energy stored in the leakage inductor is recycled through the coupled inductor. The steady-state analysis and the operating principle, with the modes of operation of the proposed converter, are discussed in detail. Finally, the proposed converter, designed and simulated in MATLAB/SIMULINK, produces an output voltage of 271 V from a 24 V DC battery input, with an output power of 407 W and an efficiency of 96.6%.

   

Full Text

 

Title:

Design of auto-gated flip-flops based on self gated mechanism

Author (s):

S. Sangeetha and A. Sathya

Abstract:

Radiation hardening by design (RHBD) has become a necessary practice when creating circuits to operate in radiation environments. While employing RHBD techniques involves trade-offs between size, speed and power, novel designs help to minimize these penalties. Space radiation is the primary source of radiation errors in circuits, and two types of single event effects, single event upsets (SEUs) and single event transients (SETs), are increasingly becoming a concern. While numerous methods currently exist to nullify SEUs and SETs, special consideration is given in this work to the techniques of temporal hardening and interlocking. Temporal hardening mitigates both SEUs and SETs by spacing critical nodes through the use of delay elements, thus allowing collected charge to be removed; interlocking creates redundant nodes to rectify charge collection on any single node. This paper presents an innovative D flip-flop in a CMOS design. The flip-flop physical design is laid out in the nm process in the form of an interleaved multi-bit cell, and the circuitry necessary for the flip-flop to be hardened against SETs and SEUs is analysed, with simulations verifying these claims. Comparisons are made to an unhardened D flip-flop in terms of speed, size and power consumption, depicting how the technique increases all three over an unhardened flip-flop. Finally, blocks from both the hardened and unhardened flip-flops are placed in 4-bit counter design flows, which are compared in size and speed to show the effects of using the high-density multi-bit layout.

   

Full Text

 

Title:

Design, Construction and Performance Analysis of Low Cost Fixed Bed Biomass Gasifier

Author (s):

G. Sreelal

Abstract:

Liquefied petroleum gas (LPG) is one of the most convenient fuels for cook stoves. The main reasons why LPG is widely adopted for household use are that it is convenient to operate, easy to control, and clean to use because of the blue flame emitted during cooking. However, because of the continued increase in the price of oil on the world market, the price of LPG has gone up tremendously and is continuously increasing at a fast rate. Given this problem, research centres and institutions are challenged to develop a cooking technology that utilizes alternative sources other than LPG; the potential of biomass as an alternative fuel source to replace LPG is a promising option. Hence, this project work focuses on fabricating an environmentally friendly, low-cost, fixed-bed (downdraft) biomass gasifier that completely utilizes producer gas and converts it into an efficient energy resource.

   

Full Text

 

Title:

Evaluation of capacitance of conducting bodies for electromagnetic modeling

Author (s):

M. Dhamodaran and R. Dhanasekaran

Abstract:

This paper presents the evaluation of the capacitance of different conducting bodies using the finite element method (FEM). There are various numerical methods, such as the finite difference method, the method of moments and Monte Carlo methods, but the finite element method has more advantages. The surfaces are discretized using triangular subsections; FEM is a suitable method for the computation of capacitance on metallic surfaces. Accurate estimation of electrical parameters is essential for electromagnetic modeling and antenna design. In this paper, the capacitances of square, rectangular, elliptical and circular plates are computed using the finite element method, with COMSOL Multiphysics software used for the simulation. FEM proves a suitable and efficient method for the computation of electrical parameters: our simulation results are compared with other results available in the literature and show good agreement.

   

Full Text

 

Title:

Experimental and Investigation of Micro Electric Discharge Machining Process of AISI 1040

Author (s):

T. Ponvel Murugan and T. Rajasekaran

Abstract:

There is an increasing demand for industrial products not only with an increased number of functions but also with reduced size. Hence, it is essential to develop products with maximum functions and minimum size. Micromachining technology gives the best solution, producing products with a maximum number of functions whose size is in the micrometer range, and it uses various machining techniques to launch miniaturized products efficiently and well ahead of competitors in the market. One of these techniques is micro electrical discharge machining (micro-EDM), which uses the same working principle as EDM: repetitive discharges of electrical sparks in the gap between the tool (electrode) and the workpiece. AISI 1040 steel is a medium carbon steel that provides high yield strength; it is employed in making spring materials, cutting saws and blades, and in micro-level applications it is used in the manufacture of micro grippers and micro actuators. Its wear rate is also lower compared to the copper and graphite electrodes employed previously in EDM machining. In the present work, optimization of micro electrical discharge machining parameters using Taguchi's approach is proposed for AISI 1040 steel because of its higher hardness and the economic feasibility of producing dies at lower cost. Experimentation was planned as per Taguchi's L9 orthogonal array. Each experiment was performed under different machining conditions of gap voltage, capacitance, feed and threshold, and two responses, material removal rate (MRR) and surface roughness, were considered for each experiment. The optimum machining parameter combination is obtained using the analysis of the signal-to-noise (S/N) ratio. The level of importance of the machining parameters for the material removal rate and surface roughness is determined using analysis of variance (ANOVA); the most effective parameters for both the MRR and the surface roughness are found to be gap voltage and capacitance. The variation of the MRR and surface roughness with the machining parameters is optimized using the Taguchi technique and grey relational analysis with the experimental values.
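The S/N ratios used for the two responses above have standard Taguchi forms: larger-the-better for the material removal rate and smaller-the-better for surface roughness. A sketch with hypothetical replicate readings, not the paper's measurements:

```python
import math

def sn_larger_the_better(values):
    """Taguchi S/N ratio for responses to maximize, e.g. material removal rate."""
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / len(values))

def sn_smaller_the_better(values):
    """Taguchi S/N ratio for responses to minimize, e.g. surface roughness."""
    return -10.0 * math.log10(sum(y * y for y in values) / len(values))

# Hypothetical replicate readings from one L9 trial:
mrr = [0.012, 0.015, 0.013]   # mm^3/min, to maximize
ra = [1.8, 2.1, 1.9]          # micrometres, to minimize
print(round(sn_larger_the_better(mrr), 2))   # -37.61
print(round(sn_smaller_the_better(ra), 2))   # -5.74
```

In a Taguchi study the S/N ratio is computed for every trial of the orthogonal array, and the level of each factor that maximizes the mean S/N ratio is selected as optimal.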

   

Full Text

 

Title:

Independent component analysis based on blind source separation by using Markovian and invertible filter model

Author (s):

Navaneetha Velammal M.,  Nirmal Kumar P. and Surya Priyanka P.

Abstract:

In blind source separation, independent component analysis (ICA) separates the underlying sources from a given mixture. Earlier methods for blind sources exploit either their non-Gaussianity or their sample dependence; the proposed method can exploit both properties jointly. The proposed system uses the mutual information rate for the analysis and derivation of the algorithms. Two types of source models are used for entropy-rate estimation, a Markovian model and an invertible filter model, which together give a general ICA framework. Under the Markovian source model, the entropy rate equals the difference between two joint entropies. Under the invertible filter source model, where the source is generated by an invertible filter driven by an independently and identically distributed (i.i.d.) random process, the entropy rate of the source equals the entropy of the driving process under some constraints. The proposed FastICA algorithm for entropy estimation is implemented in MATLAB 2009.
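Source separation of the kind described above can be sketched with scikit-learn's FastICA on synthetic mixtures; this illustrates plain FastICA on a two-source toy problem, not the entropy-rate variant proposed in the paper:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)                      # smooth source
s2 = np.sign(np.sin(3 * np.pi * t))             # square-wave source
S = np.c_[s1, s2] + 0.02 * rng.randn(2000, 2)   # slightly noisy sources

A = np.array([[1.0, 0.5], [0.4, 1.0]])          # "unknown" mixing matrix
X = S @ A.T                                     # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)                    # recovered up to order/scale/sign

# each recovered component should correlate strongly with one true source
corr = np.abs(np.corrcoef(S_est.T, S.T))[:2, 2:]
print(round(float(corr.max(axis=1).min()), 2))
```

ICA can only recover sources up to permutation, scaling and sign, which is why the check uses absolute correlations rather than direct differences.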

   

Full Text

 

Title:

Forwarding group node selection in mobile ad hoc networks using intelligent data analysis

Author (s):

Vigneshwaran P. and Dhanasekaran R.

Abstract:

Mobile Ad hoc Networks (MANETs) are identified as an emerging field for researchers, given the huge increase in mobile users nowadays. Applications such as emergency searches, rescues and military battlefields use MANETs, as it is not possible to establish a fixed network for communication. Multicast routing is especially preferred to satisfy such needs, since the same information can be transmitted to a group of users. Even though many multicast routing protocols have been proposed, their performance still falls short of achieving reliability and scalability in a MANET when transmitting a packet to multiple users. In this paper, we propose a multicast routing protocol that improves the Packet Delivery Ratio (PDR) with minimum control overhead and delay based on intelligent data analysis techniques, namely the Radial Basis Function (RBF). The analysis identifies the probability of the optimum forwarding group node based on the information augmented with the JOIN_QUERY packet. The performance evaluation shows that the proposed approach ensures the PDR and significantly reduces the control overhead.

   

Full Text

 

Title:

Genetic algorithm based selection of wheeling transactions

Author (s):

A. Parthasarathy  and  R. Dhanasekaran

Abstract:

This paper describes how to select a particular wheeling option among the various feasible transaction options available under the de-regulated environment of modern power systems. An efficient GA-based optimal power flow (GA-OPF) algorithm is proposed to determine the optimal selection based on wheeling cost. In the proposed GA-OPF, the Newton-Raphson method and a genetic algorithm are used for power flow and economic dispatch, respectively. Based on the power transfer capability and minimum generation cost, an optimal wheeling option is suggested both to the owners of private non-utility generators (i.e. independent power producers or co-generators) and to the utility. The proposed algorithm is independent of the cost characteristics of the non-utility generators (NUGs). The proposed model has been tested on the IEEE 30-bus test system with synthetic imposition of wheeling transactions; the solutions obtained are quite encouraging and useful in the present de-regulated environment.
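The economic-dispatch half of the GA-OPF described above can be sketched as a genetic algorithm minimizing quadratic generation costs under a power-balance penalty; the cost curves, demand and GA settings below are hypothetical, and no Newton-Raphson power flow is included:

```python
import random

# Hypothetical quadratic cost curves: cost_i(P) = a + b*P + c*P^2 ($/h, P in MW)
GENS = [(100.0, 20.0, 0.050, 10.0, 80.0),   # (a, b, c, Pmin, Pmax)
        (120.0, 15.0, 0.080, 10.0, 80.0),
        (80.0, 25.0, 0.030, 10.0, 80.0)]
DEMAND = 150.0   # MW to be supplied

def cost(dispatch):
    """Total generation cost plus a penalty enforcing the power balance."""
    c = sum(a + b * p + q * p * p for (a, b, q, _, _), p in zip(GENS, dispatch))
    return c + 1e3 * abs(sum(dispatch) - DEMAND)

def ga(pop_size=60, generations=200, mut_rate=0.2):
    rng = random.Random(3)
    pop = [[rng.uniform(lo, hi) for (_, _, _, lo, hi) in GENS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[:pop_size // 4]          # elitism: keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(p1, p2)]   # arithmetic crossover
            for d, (_, _, _, lo, hi) in enumerate(GENS):    # gaussian mutation
                if rng.random() < mut_rate:
                    child[d] = min(max(child[d] + rng.gauss(0.0, 3.0), lo), hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

best = ga()
print(round(sum(best), 1))   # total dispatch, near the 150 MW demand
```

In the full GA-OPF, each candidate dispatch would additionally be checked by a Newton-Raphson power flow for losses and line-flow feasibility before its fitness is assigned.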

   

Full Text

 
