No. |
Title and Author |
Area |
Country |
Page |
251 |
Planning, Scheduling and Resource Optimization for a Villa Using MS Project 2010
-Pradeep ; Dr Rajendra S
Managing a large-scale project is one of the most challenging jobs a manager can take on, since it involves numerous activities throughout the project. Modern project management has developed various network-based techniques to plan a project's processes, costs and resources over time. The Critical Path Method (CPM) is one of the best procedures for planning, scheduling and optimizing resource usage; CPM scheduling is a basic project control tool and is therefore used in all types of projects. Completing a project on time and within budget is not an easy task, and scheduling plays an important role in both the time and cost aspects of a project. Project managers use different planning and scheduling systems, usually based on Gantt or bar charts. The Critical Path Method provides them with a basic and systematic approach, realized in software such as Microsoft Project and Primavera.
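To make the CPM procedure concrete, the sketch below computes earliest/latest times and the critical path for a small hypothetical villa-construction network (the activities and durations are illustrative assumptions, not data from the paper):

```python
# Illustrative Critical Path Method (CPM) sketch: forward pass (earliest
# start/finish), backward pass (latest start/finish), zero-float activities
# are critical. The activity network is a made-up example.

def topo_order(activities):
    """Return activities so that every predecessor comes before its successors."""
    order, seen = [], set()
    def visit(n):
        if n not in seen:
            seen.add(n)
            for p in activities[n][1]:
                visit(p)
            order.append(n)
    for n in activities:
        visit(n)
    return order

def critical_path(activities):
    """activities: {name: (duration, [predecessors])} -> (project_duration, critical_activities)."""
    es, ef = {}, {}
    for name in topo_order(activities):          # forward pass
        dur, preds = activities[name]
        es[name] = max((ef[p] for p in preds), default=0)
        ef[name] = es[name] + dur
    duration = max(ef.values())
    succs = {n: [m for m, (_, ps) in activities.items() if n in ps] for n in activities}
    ls, lf = {}, {}
    for name in reversed(topo_order(activities)):  # backward pass
        lf[name] = min((ls[s] for s in succs[name]), default=duration)
        ls[name] = lf[name] - activities[name][0]
    critical = [n for n in activities if ls[n] == es[n]]  # zero total float
    return duration, critical

# Hypothetical villa activities: (duration in days, predecessors)
acts = {"foundation": (10, []),
        "walls":      (15, ["foundation"]),
        "roof":       (7,  ["walls"]),
        "wiring":     (5,  ["walls"]),
        "finish":     (8,  ["roof", "wiring"])}
duration, critical = critical_path(acts)
```

Wiring has two days of float (it can slip without delaying the finish), so it is the only non-critical activity in this example.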
|
Civil Engineering |
India |
1085-1089 |
252 |
Studies on Physico - Mechanical Properties of Chloroprene Rubber Vulcanizate for Belting Application
-A.Murugesan ; B.Rajkumar; R.Baskaran; R.Arichandran
Among vulcanized elastomers, chloroprene rubber (Neoprene) offers good performance and is one of the most widely used today; in particular, it plays a vital role in the manufacture of power transmission belts in the automotive industry. Compounding was carried out in a two-roll mill and the compound was vulcanized at 150°C in a hydraulic press, a widely used high-temperature curing method that develops the physical and mechanical properties of the chloroprene compound. In this work, chloroprene samples were prepared according to ASTM standards, and the rheological and physico-mechanical properties of the CR vulcanizate were studied.
|
Rubber Technology |
India |
1090-1093 |
253 |
A Survey of Online Credit Card Fraud Detection using Data Mining Techniques
-Shrutiben Jayantibhai Patel ; Palak V. Desai
The use of credit cards has increased as the volume of online transactions has grown. With day-to-day use of credit cards for online payment as well as regular purchases, the number of frauds associated with them is also rising. To reduce the huge financial losses caused by fraud, a number of modern detection techniques have been developed based on data mining, neural networks, genetic algorithms and related methods. This paper surveys techniques for online credit card fraud detection using the Hidden Markov Model, the Genetic Algorithm and a Hybrid Model, and presents a comparison between them.
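A minimal sketch of the HMM idea surveyed here: score a cardholder's transaction sequence with the forward algorithm and flag sequences that are unlikely under the trained model. The states, observation buckets and probabilities below are illustrative assumptions, not parameters from any surveyed paper:

```python
# Hidden Markov Model fraud screening sketch: two hidden spending profiles,
# observations are transaction-amount buckets. All numbers are made up.

def forward_likelihood(obs, start, trans, emit):
    """Probability of an observation sequence under the HMM (forward algorithm)."""
    states = list(start)
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

start = {"normal": 0.9, "fraud": 0.1}
trans = {"normal": {"normal": 0.95, "fraud": 0.05},
         "fraud":  {"normal": 0.3,  "fraud": 0.7}}
emit  = {"normal": {"low": 0.6, "med": 0.3, "high": 0.1},
         "fraud":  {"low": 0.1, "med": 0.2, "high": 0.7}}

typical = forward_likelihood(["low", "med", "low"], start, trans, emit)
unusual = forward_likelihood(["high", "high", "high"], start, trans, emit)
# A sudden run of high-value transactions scores far lower under the model
# of normal behaviour, so it would be flagged for review.
```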
|
Computer Engineering |
India |
1094-1096 |
254 |
Lid Driven Cavity Flow Simulation using CFD & MATLAB
-Jagram Kushwah ; K. C. Arora; Manoj Sharma
The steady incompressible Navier-Stokes equations on a uniform grid have been studied at various Reynolds numbers using CFD (Computational Fluid Dynamics). The aim of the present paper is to obtain the stream function and velocity field in the steady state using a finite-difference formulation of the momentum and continuity equations; the Reynolds number dominates the flow problem. Taylor series expansion has been used to convert the governing equations into algebraic form via finite-difference schemes, and MATLAB has been used to draw the flow simulations inside the driven cavity.
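One building block of such a solver is the central-difference Poisson equation for the stream function, ∇²ψ = −ω. The sketch below (in Python rather than MATLAB, with a made-up uniform vorticity field and grid size) shows one Jacobi relaxation sweep of that equation:

```python
# Jacobi relaxation sketch for the stream-function Poisson equation
# grad^2(psi) = -omega on a uniform grid, central differences.
# Grid size, spacing and the vorticity field are illustrative assumptions.

def jacobi_step(psi, omega, h):
    """One Jacobi sweep; boundary values of psi are held fixed (cavity walls)."""
    n = len(psi)
    new = [row[:] for row in psi]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new[i][j] = 0.25 * (psi[i + 1][j] + psi[i - 1][j]
                                + psi[i][j + 1] + psi[i][j - 1]
                                + h * h * omega[i][j])
    return new

n, h = 11, 0.1
psi = [[0.0] * n for _ in range(n)]      # psi = 0 on all cavity walls
omega = [[1.0] * n for _ in range(n)]    # illustrative uniform vorticity
for _ in range(200):                     # iterate toward the steady state
    psi = jacobi_step(psi, omega, h)
```

In a full lid-driven cavity solver this sweep alternates with an update of the vorticity transport equation, which is where the Reynolds number enters.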
|
Mechanical Engineering |
India |
1097-1099 |
255 |
Modified Golomb Code For Integer Representation
-J.Nelson Raja ; P.Jaganathan; S.Domnic
In this computer age, computer applications handle data in the form of text, numbers, symbols and combinations of all of them. The primary objective of data compression is to reduce the size of data that needs to be stored and transmitted on digital devices; hence, data compression plays a vital role in data storage and data transmission. The Golomb code, a variable-length integer code, has been used for text, image, video and audio compression. Its drawback is that it requires more bits to represent large integers if the divisor is small and, conversely, more bits to represent small integers if the divisor is large. This paper proposes a Modified Golomb Code, based on the Golomb Code and the Extended Golomb Code, to represent both small and large integers compactly for the chosen divisor. As an application, the Modified Golomb Code is used with the Burrows-Wheeler transform for text compression. The performances of the Golomb Code and the Modified Golomb Code are evaluated on the Calgary corpus. The experimental results show that the proposed code provides a better compression rate than the Golomb code on average. Compared with the Extended Golomb Code (EGC), the proposed code achieves a significant improvement on the binary files of the Calgary corpus.
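The divisor trade-off described above can be seen directly in classical Golomb coding (the proposed Modified Golomb Code itself is not reproduced here):

```python
# Classical Golomb code: quotient in unary, remainder in truncated binary.
# Demonstrates the divisor trade-off: small m inflates large integers,
# large m inflates small integers.
import math

def golomb_encode(n, m):
    """Golomb codeword (as a bit string) for non-negative n with divisor m."""
    q, r = divmod(n, m)
    code = "1" * q + "0"                 # quotient q in unary, 0-terminated
    if m == 1:
        return code                      # remainder carries no information
    b = math.ceil(math.log2(m))          # remainder takes b-1 or b bits
    cutoff = (1 << b) - m
    if r < cutoff:
        return code + format(r, "b").zfill(b - 1)
    return code + format(r + cutoff, "b").zfill(b)
```

For example, with divisor 2 the integer 100 costs 52 bits (a 51-bit unary quotient), while with divisor 64 it costs only 8 bits; conversely, the integer 1 costs 2 bits with divisor 2 but 7 bits with divisor 64.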
|
Computer Applications |
India |
1100-1104 |
256 |
A Review Paper on Stereo Vision Based Depth Estimation
-Radhika J. Raval ; Mahasweta Joshi; Bhavesh Tanawala
Stereo vision is a challenging problem and a wide research topic in computer vision. It has attracted a lot of attention because it is a cost-efficient alternative to expensive sensors, and it has found great importance in many fields and applications, including robotics, 3-D scanning, 3-D reconstruction, driver assistance systems, forensics and 3-D tracking. The main challenge of stereo vision is to generate an accurate disparity map. Stereo vision algorithms usually perform four steps: first, matching cost computation; second, cost aggregation; third, disparity computation or optimization; and fourth, disparity refinement. Stereo matching problems are also discussed. Although a large number of algorithms have been developed for stereo vision, characterizing their performance has received less attention. This paper gives a brief overview of existing stereo vision algorithms. After evaluating the literature, we find that the focus has been on cost aggregation and multi-step refinement. Segment-based methods have also attracted attention due to their good performance, and using improved filters for cost aggregation in stereo matching achieves better results.
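The first and third of the four steps can be sketched on a single scanline: a sum-of-absolute-differences matching cost aggregated over a small window, followed by winner-take-all disparity selection. The signals and window size below are illustrative assumptions:

```python
# Scanline block-matching sketch: SAD matching cost (step 1), aggregated
# over a +/-win window (step 2), winner-take-all disparity (step 3).

def disparity_scanline(left, right, max_disp, win=1):
    """Per-pixel disparity on one rectified scanline via SAD block matching."""
    n = len(left)
    disp = [0] * n
    for x in range(n):
        best_cost, best_d = float("inf"), 0
        for d in range(min(max_disp, x) + 1):
            cost = sum(abs(left[min(max(x + k, 0), n - 1)]
                           - right[min(max(x + k - d, 0), n - 1)])
                       for k in range(-win, win + 1))       # clamp at borders
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp

# The right scanline is the left one shifted by 2 pixels, so the recovered
# disparity should be 2 wherever the intensity pattern is visible.
left  = [0, 0, 10, 50, 90, 50, 10, 0, 0, 0]
right = [10, 50, 90, 50, 10, 0, 0, 0, 0, 0]
disp = disparity_scanline(left, right, max_disp=4)
```

Real algorithms replace the raw SAD window with the improved aggregation filters and multi-step refinement that this review highlights.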
|
Computer Engineering |
India |
1105-1108 |
257 |
Secure Mining of Association Rules in Horizontally Distributed Databases
-Shivanand Patil ; Pratik Mendre; Sagar Pokharkar; Sukhada Vavhal
We propose a protocol for secure mining of association rules in horizontally distributed databases. The leading existing protocol is that of Kantarcioglu and Clifton [1]. Our protocol, like theirs, relies on the Fast Distributed Mining (FDM) algorithm of Cheung et al., which is an unsecured distributed version of the Apriori algorithm. The main ingredients of our protocol are two novel secure multi-party algorithms: one that computes the union of the private subsets held by the interacting players, and one that tests whether an element held by one player is included in a subset held by another. Our protocol offers enhanced privacy with respect to the protocol in [1]. In addition, it is simpler and significantly more efficient in terms of communication rounds, communication cost and computational cost. Data mining techniques are used to discover patterns in huge databases, but these patterns can sometimes disclose sensitive information about the data holder or the persons whose data are the subject of the patterns. The idea of privacy-preserving data mining is to recognize and prevent such disclosures, which are evident in the kinds of patterns learned using traditional data mining techniques [5].
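For context, the local computation each site performs in FDM-style protocols is levelwise Apriori counting; the sketch below shows that counting step only (the secure set-union and inclusion tests are exactly what the protocol adds on top, and are omitted). The transactions are a made-up example:

```python
# Levelwise Apriori sketch: the per-site frequent-itemset counting that
# FDM distributes. Cryptographic protocol steps are not shown.

def frequent_itemsets(transactions, min_support):
    """Return {itemset: count} for itemsets in >= min_support transactions."""
    items = sorted({i for t in transactions for i in t})
    level = [frozenset([i]) for i in items]
    frequent = {}
    while level:
        counts = {c: sum(1 for t in transactions if c <= t) for c in level}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # Candidate generation: unions of survivors that are one item larger.
        level = list({a | b for a in survivors for b in survivors
                      if len(a | b) == len(a) + 1})
    return frequent

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"},
                {"b", "c"}, {"a", "b", "c"}]
result = frequent_itemsets(transactions, min_support=3)
```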
|
Computer Engineering |
India |
1109-1111 |
258 |
A Survey on Data Mining Techniques for Crime Hotspots Prediction
-Neha Patel ; Prof.Shivani V. Vora
A crime is an act that violates the laws of a country or region. Crime hotspot prediction is the technique of finding areas on a map with high crime intensity: it uses crime data, including each area's crime rate, to predict future locations with high crime intensity. The motivation of crime hotspot prediction is to raise people's awareness of dangerous locations in certain time periods, and it can help in allocating police resources to create a safe environment. This paper presents a survey of different data mining techniques for crime hotspot prediction.
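The simplest hotspot technique in this family buckets incident coordinates into grid cells and flags cells whose counts exceed a threshold. The coordinates, cell size and threshold below are illustrative assumptions:

```python
# Grid-based hotspot sketch: count incidents per cell, flag dense cells.
from collections import Counter

def hotspots(incidents, cell_size, threshold):
    """incidents: iterable of (x, y) -> set of grid cells with count >= threshold."""
    counts = Counter((int(x // cell_size), int(y // cell_size))
                     for x, y in incidents)
    return {cell for cell, n in counts.items() if n >= threshold}

incidents = [(0.2, 0.3), (0.4, 0.1), (0.3, 0.4),   # clustered in cell (0, 0)
             (5.1, 5.2)]                            # isolated incident
dense = hotspots(incidents, cell_size=1.0, threshold=3)
```

Surveyed data mining techniques refine this idea with clustering (e.g. kernel density or k-means style methods) and with temporal features.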
|
Computer Engineering |
India |
1112-1114 |
259 |
A Review: Study of Heat Transfer in Fins
-Bharti Sharma ; Ritika Tripathi; Pooja Kumari
Extended surfaces (fins) are used to increase the heat transfer rate, and this study concerns extended surfaces and their heat transfer rate. This review summarizes previous work on various fin shapes and discusses heat transfer in different fin geometries, because the heat transfer rate depends on the fin material, cross-section, profile, air velocity and other factors. The review thus helps in selecting fins according to the requirements.
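The material dependence mentioned above follows from the standard one-dimensional fin relations: with fin parameter m = sqrt(hP/(kA)), a fin with an adiabatic tip has efficiency η = tanh(mL)/(mL). The worked sketch below uses illustrative pin-fin dimensions, not values from any reviewed paper:

```python
# One-dimensional fin relations: m = sqrt(hP/(kA)), efficiency tanh(mL)/(mL)
# for an adiabatic-tip fin. Dimensions and coefficients are made-up examples.
import math

def fin_parameter(h, P, k, A):
    """h: convection coeff, P: perimeter, k: conductivity, A: cross-section."""
    return math.sqrt(h * P / (k * A))

def fin_efficiency(m, L):
    """Adiabatic-tip fin efficiency for fin length L."""
    return math.tanh(m * L) / (m * L)

# Same pin fin in aluminium (k = 200 W/m.K) vs steel (k = 15 W/m.K):
h, d, L = 25.0, 0.005, 0.05                 # W/m^2.K, m, m
P, A = math.pi * d, math.pi * d ** 2 / 4
eta_al = fin_efficiency(fin_parameter(h, P, k=200.0, A=A), L)
eta_st = fin_efficiency(fin_parameter(h, P, k=15.0, A=A), L)
# The high-conductivity fin is markedly more efficient, illustrating the
# material dependence of the heat transfer rate discussed in the review.
```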
|
Thermal Engineering |
India |
1115-1116 |
260 |
An Analysis of Rule Based Mining for Protein Phosphorylation
-S.Padmapriya ; R.Indra; J.Sengottuvelu
The study of proteins and their structure is a vast and complex subject, and there have been many earlier efforts to classify and categorize protein structures. The focus of the current study is to generate rules based on information extraction models using data mining techniques. The architecture should avoid computational overheads, and the rule-based Information Extraction engine should implement all the features and display the patterns in a consistent manner. In normal signalling networks, rules capture responses to extracellular protein-structure stimuli and other intracellular balance changes. The current work focuses on protein phosphorylation information, but the IE pipeline-based model architecture can be readily ported to the extraction of event types other than phosphorylation. The generated rules are shown as graphs for analysis purposes.
|
Computer Science and Engineering |
India |
1117-1119 |
261 |
An Efficient Protocol for Predicting User Behaviour Using Call Detail Data
-B.Praveena ; R.Indra; J.Sengottuvelu
The spread of wireless mobile devices has led to an explosion in the computing world, and this huge body of information can be utilized by observing mobile network behavior. The big data generated in mobile networks enables useful insights into user patterns through big data analysis and computing techniques. The proposed model uses data analysis techniques to categorize the data in network communication into two types: user-based and network-oriented. The essential data is then processed by the telecommunication operators, who face the tremendous challenge of providing satisfactory service to mobile users with varying QoS requirements. By including high-volume media transmission and a huge amount of machine-to-machine (M2M) connectivity, the data is summarized and reviewed to support temporal and spatial analysis, data mining and statistical tests. Call measurements and call detail records are used to understand base station behavior, and user behavior is revealed and predicted by comparing base station locations against a real-world map.
|
Computer Science and Engineering |
India |
1120-1122 |
262 |
Survey Paper on Coupling Measurement Tools in Object Oriented Software
-Sampada Prakash kale ; Prof. V. S. Bidve
In software engineering, coupling is the manner and degree of interdependence between software modules: a measure of how closely connected two routines or modules are, i.e. the strength of the relationships between them. Low coupling is a sign of a well-structured computer system and a good design. A large number of metrics have been proposed for measuring properties of object-oriented software such as size, inheritance, cohesion and coupling. As object-oriented analysis and design techniques become widely used, the demand for assessing the quality of object-oriented designs substantially increases, and there has recently been much research effort to develop and empirically validate metrics for OO design quality. Coupling metrics are most useful when they are measured correctly. This paper surveys the types of coupling measurement tools available to date.
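One classic metric such tools compute is CBO (coupling between objects): for each class, the number of other classes it uses or is used by. A minimal sketch over a hypothetical class dependency map:

```python
# CBO (coupling between objects) sketch: coupling counts both directions,
# classes a class uses and classes that use it. The dependency map is a
# made-up example, not output from any surveyed tool.

def cbo(dependencies):
    """dependencies: {class: set of classes it uses} -> {class: CBO value}."""
    coupled = {c: set(uses) for c, uses in dependencies.items()}
    for c, uses in dependencies.items():
        for u in uses:
            coupled.setdefault(u, set()).add(c)   # reverse direction
    return {c: len(peers - {c}) for c, peers in coupled.items()}

deps = {"Order":    {"Customer", "Invoice"},
        "Invoice":  {"Customer"},
        "Report":   {"Invoice"},
        "Customer": set()}
scores = cbo(deps)
```

Invoice scores highest here (it touches Customer and is touched by Order and Report), which is exactly the kind of hotspot a coupling measurement tool is meant to surface.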
|
Software Engineering |
India |
1123-1126 |
263 |
A Survey on Node Usage Probability
-Smruti Khadse ; Prof.T A.Chavan
Communication networks are rapidly becoming a necessity of our time, the Internet being the best example, yet traffic congestion is the major cause of degraded overall network performance. Node usage probability is a concept from the complex-network perspective for distributing traffic load uniformly: it measures how frequently a node is chosen to relay packets in a network. Problems arise when nodes are overused, or sometimes not used at all, inducing congestion and hampering overall performance, so this metric informs effective network design strategies, routing algorithms and resource allocation schemes that improve overall traffic performance. The performance of a minimum-node-usage (MNU) routing algorithm is compared with other routing algorithms, such as shortest path (SP) and minimum degree (MD) routing, across network topologies and resource allocation schemes. The results show that routing based on minimizing node usage can balance the traffic load effectively, and that resource allocation based on node usage probability outperforms uniform and degree-based allocation schemes. The analysis and results give an idea of which topology, routing method and resource allocation scheme to use for achieving optimal network performance.
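Node usage under shortest-path routing can be estimated by routing one packet between every ordered node pair along a BFS shortest path and counting how often each node relays. The star topology below is an illustrative worst case, not one from the survey:

```python
# Node usage probability sketch under shortest-path (SP) routing: fraction
# of source-destination pairs whose path relays through each node.
from collections import deque

def shortest_path(adj, src, dst):
    """One BFS shortest path from src to dst in an unweighted graph."""
    prev, seen, q = {src: None}, {src}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                prev[v] = u
                q.append(v)
    path, u = [], dst
    while u is not None:
        path.append(u)
        u = prev[u]
    return path[::-1]

def node_usage(adj):
    """Fraction of ordered node pairs that relay through each node."""
    use, pairs = {n: 0 for n in adj}, 0
    for s in adj:
        for t in adj:
            if s != t:
                pairs += 1
                for relay in shortest_path(adj, s, t)[1:-1]:  # interior nodes
                    use[relay] += 1
    return {n: c / pairs for n, c in use.items()}

# A star topology concentrates all relaying on the hub: the classic
# congestion scenario motivating minimum-node-usage routing.
star = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
usage = node_usage(star)
```

An MNU routing algorithm would instead prefer paths through lightly used nodes when alternatives exist, flattening this distribution.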
|
Information Technology |
India |
1127-1130 |