Many techniques have been proposed to store and query XML data efficiently. One way to achieve this goal is to transform XML data into a relational format and store it in a relational database. Most prior work, however, transforms only the content and structure of the XML schema; even studies that address the semantic constraints of the XML schema do not preserve all of them. In this paper, we propose a systematic technique for extracting semantic constraints from an XML schema, together with a storage method that transforms the extracted result into a relational schema without any loss of semantic constraints. The transformation algorithm extracts and stores the semantic constraints of the XML schema and shows how the extracted information is stored according to the schema notation. It also identifies the semantic knowledge that must be confirmed during the transformation to ensure a correct relational schema. The technique reduces storage redundancy and maintains content and structure together with integrity constraints.
The proportion of online communication carried out through Internet messengers keeps increasing, yet few applications make effective use of messenger conversation data. Such data reflect a user's linguistic habits, which are revealed through frequently used words and emoticons, and from these the user's emotions can be inferred. This paper proposes a method that efficiently classifies the emotions of a messenger user from frequently used words and symbols. In repeated experiments, the resulting emotion classifier achieves an accuracy of more than 95%.
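As a minimal sketch of the idea (not the paper's actual classifier: the lexicon, emotion categories, and scoring rule below are assumptions for illustration), a lexicon of frequently used words and emoticons can drive a simple score-and-vote classifier:

```python
from collections import Counter

# Hypothetical emotion lexicon of frequently used words/emoticons.
# In the paper such cues would be learned from messenger logs.
LEXICON = {
    "joy":     {"haha", "lol", ":)", "great", "yay"},
    "sadness": {":(", "sigh", "tired", "sorry"},
    "anger":   {"angry", ">:(", "hate", "ugh"},
}

def classify_emotion(message):
    """Score each emotion by counting lexicon hits among the tokens."""
    tokens = Counter(message.lower().split())
    scores = {emo: sum(tokens[w] for w in words)
              for emo, words in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"
```

A message with no lexicon hits falls back to "neutral" rather than guessing.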
Predicting time-series data has long been a research issue, and many methods have been proposed in the literature. In this paper, we propose a method that measures similarity among time-series data using a Hidden Markov Model (HMM) and its likelihood, and then determines the future direction of the data's movement. The query sequence is modeled as an HMM, and the model is evaluated over pre-recorded time series to find the subsequence most similar to it, where similarity is measured by likelihood. Once the best subsequence is chosen, the portion that follows it is used to predict the next phase of the data's movement. A number of experiments with different parameters were conducted to confirm the validity of the method, using KOSPI data for verification.
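The matching step can be sketched as follows (a toy illustration: the 2-state discrete HMM below uses hand-set parameters rather than parameters trained on the query sequence as in the paper). The scaled forward algorithm scores every window of a stored series, the highest-likelihood window is the best match, and the observation that follows it serves as the prediction:

```python
import math

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete observation
    sequence under an HMM (pi: initial, A: transition, B: emission)."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    c = sum(alpha)
    loglik = math.log(c)
    alpha = [a / c for a in alpha]
    for t in range(1, len(obs)):
        alpha = [B[s][obs[t]] * sum(alpha[p] * A[p][s] for p in range(n))
                 for s in range(n)]
        c = sum(alpha)             # rescale each step to avoid underflow
        loglik += math.log(c)
        alpha = [a / c for a in alpha]
    return loglik

def best_match(series, model, w):
    """Start index of the width-w subsequence the model likes best."""
    pi, A, B = model
    return max(range(len(series) - w + 1),
               key=lambda i: forward_loglik(series[i:i + w], pi, A, B))

# Illustrative model biased toward symbol 1 ("up" moves).
model = ([0.5, 0.5],
         [[0.9, 0.1], [0.1, 0.9]],
         [[0.1, 0.9], [0.1, 0.9]])
series = [0] * 10 + [1] * 5 + [0] * 10
start = best_match(series, model, 5)   # finds the all-ones run
prediction = series[start + 5]         # next observation after the match
```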
Sequential pattern mining, which discovers frequent subsequences as patterns in a sequence database, is an important data mining problem with broad applications. Since a sequential pattern in DNA sequences can be a motif, we studied finding sequential patterns in DNA sequences. Most previously proposed mining algorithms follow an exact-matching definition of a sequential pattern, so in practice they cannot cope with noisy environments and inaccurate data. These problems occur frequently in biological data such as DNA sequences. We therefore investigated an approximate matching method for such cases. Our idea is based on the observation that all occurrences of a frequent pattern can be classified into groups, which we call approximated patterns. The existing PrefixSpan algorithm can successfully find sequential patterns in a long sequence; we improved it to find approximate sequential patterns. The experimental results showed that when the pattern length is 4, the proposed method finds five times as many repeats as PrefixSpan.
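The grouping idea can be illustrated with a toy example (this is not the modified PrefixSpan itself, only the approximate-occurrence notion it builds on): occurrences within a small Hamming distance of a pattern are counted together, so a noisy copy such as ACGA still contributes to the pattern ACGT:

```python
def approx_occurrences(seq, pattern, max_mismatch=1):
    """Start positions of windows of seq that match pattern with at most
    max_mismatch substitutions (Hamming distance)."""
    k = len(pattern)
    hits = []
    for i in range(len(seq) - k + 1):
        mismatches = sum(1 for a, b in zip(seq[i:i + k], pattern) if a != b)
        if mismatches <= max_mismatch:
            hits.append(i)
    return hits
```

Exact matching finds only the two perfect copies of ACGT below; allowing one mismatch also groups the noisy occurrence at position 4 into the same approximated pattern.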
Recently, as the quality of multimedia data has increased, multimedia servers have come to require larger storage capacity and higher I/O bandwidth. In such large-scale multimedia servers, load imbalance among disks, caused by differences in access frequency to multimedia objects according to their popularity, significantly affects system performance. To address this problem, many data replication schemes have been proposed. In this paper, we propose a novel data migration/replication scheme that provides better storage efficiency and performance than dynamic data replication, the scheme typically employed in multimedia servers. Our scheme reduces the additional storage space required for replication, a major drawback of replication schemes, by decreasing the number of copies per object. It can also serve more concurrent users by increasing the caching effect, since the intervals between requests for each object become shorter.
In this paper, we propose an adult-image detection algorithm using a mean intensity filter and an improved 2D Hough transform. The algorithm consists of three major steps: training, recognition, and verification. The training step generates a mean nipple intensity filter that is used to detect nipple candidate regions in the recognition step. To build this filter, we convert each input color image into a grayscale image, normalize it, and average the intensities of the nipple areas. The recognition step first extracts edge images, finds connected components, and selects nipple candidate regions by considering the width-to-height ratio of each connected component. It then decides the final nipple candidates by computing the similarity between the learned mean intensity filter and the candidate areas. It also detects the breast lines of the input image using the improved 2D Hough transform. The verification step detects breast areas and identifies adult images by considering the relations between nipple candidate regions and the locations of breast lines.
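The training step's filter can be sketched as follows (a simplified illustration with made-up patch sizes and a mean-absolute-difference similarity; the paper's exact normalization and similarity measure may differ): averaging grayscale training patches yields a template, and candidate regions are scored against it:

```python
def mean_filter(patches):
    """Average equally sized grayscale patches into a single template."""
    h, w = len(patches[0]), len(patches[0][0])
    n = len(patches)
    return [[sum(p[r][c] for p in patches) / n for c in range(w)]
            for r in range(h)]

def similarity(template, patch):
    """Negative mean absolute intensity difference: higher is more similar."""
    h, w = len(template), len(template[0])
    total = sum(abs(template[r][c] - patch[r][c])
                for r in range(h) for c in range(w))
    return -total / (h * w)

# Two tiny 2x2 training patches; their element-wise mean is the template.
template = mean_filter([[[0, 2], [4, 6]], [[2, 4], [6, 8]]])
```

A candidate region identical to the template scores 0, and scores decrease as the region diverges from the learned average.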
In this paper, we design and implement a verification and interface application for interactive data broadcasting middleware. The application implements ACAP and OCAP verification items according to their types (format, protocol, resource, presentation). Using it, we can verify whether the digital set-top boxes used in terrestrial and cable digital television conform to the ACAP and OCAP standards. We evaluate the proposed application using the TVPLUSi™ verifier, which can verify interactive TV applications in a real broadcasting environment. Through this evaluation, we show that the DTB-H650F set-top box supports 80% of the OCAP standard and 95% of the ACAP standard.
With the development of IT and the growing popularity of the Internet, intrusions such as information leakage from key systems and infringement of computing networks are also increasing rapidly as a side effect. The attack traffic detection method suggested in this study uses Snort, a traditional NIDS, and filters out false positives among the detected attack traffic using Nmap information. It then performs a second filtering pass using Nessus vulnerability information, and finally performs correlation analysis that considers the appropriateness of the management system, the severity of signatures, and security holes. This reduces false-positive alarm messages, minimizes the errors they cause, and improves the overall attack detection results.
In this paper, an improved backoff algorithm is proposed that applies a multiplicative persistence factor to the IEEE 802.11 wireless LAN MAC. The algorithm complements the shortcomings of the conventional BEB (Binary Exponential Backoff) algorithm, which sets a new contention window on retransmission in the DCF/EDCF MAC. We analyze the improved backoff algorithm in terms of channel utilization, collision rate, and goodput, and compare the results with those of the conventional algorithm. The results show that the performance of the PFA backoff algorithm is 10% higher than that of the conventional BEB algorithm when the number of stations is 40.
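The difference between BEB and a persistence-factor scheme can be seen by tracing the contention window after successive collisions (the factor value 1.5 below is illustrative, not the paper's actual PF; BEB corresponds to a factor of 2):

```python
def contention_windows(cw_min, cw_max, retries, factor):
    """Contention window after each successive collision: the window is
    multiplied by `factor` and clipped to cw_max (BEB uses factor=2)."""
    cw, out = cw_min, []
    for _ in range(retries):
        cw = min(int(cw * factor), cw_max)
        out.append(cw)
    return out
```

A milder multiplier grows the window less aggressively, which trades a slightly higher collision risk for shorter average backoff delays.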
The centralized conference architecture has limited scalability because performance degrades as the number of conference participants increases. To solve this problem, several distributed conference architectures, in which new conference servers are added dynamically to the conference environment, have recently been studied. In this paper, we propose a new conference information data model for these distributed conference architectures. In the proposed model, several components are added for exchanging conference information between a primary conference server and multiple secondary conference servers. We also propose a procedure for exchanging conference information between these servers. Because the management of conference information and the SIP (Session Initiation Protocol) notifications to all conference participants can be distributed across these servers, the load on the primary conference server is reduced. The performance of the proposed model has been evaluated by experiments.
Environmental monitoring applications periodically measure the temperature, humidity, and pollution levels of large areas and are essential for a ubiquitous society. In this paper, we propose a sensor network MAC protocol suitable for environmental monitoring applications. The proposed MAC protocol achieves scalability by organizing sensor nodes into multiple groups, as in the SMAC protocol. Unlike SMAC, however, it imposes a hierarchical structure between adjacent groups. Because lower groups are synchronized to higher groups, transmission schedules are efficient: the resulting sequential transmission schedules reduce end-to-end delay and energy consumption. At the same time, since the nodes within a group are synchronized to one another, the protocol retains good adaptability and scalability compared with existing hierarchical approaches such as DMAC. We show by simulation that the proposed MAC protocol outperforms SMAC for environmental monitoring applications.
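The benefit of aligned schedules can be illustrated with a back-of-the-envelope latency model (illustrative numbers, not the protocol's actual timing): when each group's active slot immediately follows its parent group's slot, a packet advances one hop per slot instead of waiting, on average, about half a listen/sleep cycle at every hop:

```python
def end_to_end_delay(hops, cycle_ms, slot_ms, staggered):
    """Rough latency model: staggered (hierarchical) schedules forward one
    hop per active slot; independent schedules wait about half a cycle
    at each hop before the next group wakes up."""
    per_hop = slot_ms if staggered else cycle_ms / 2
    return hops * per_hop

aligned = end_to_end_delay(5, cycle_ms=1000, slot_ms=50, staggered=True)
independent = end_to_end_delay(5, cycle_ms=1000, slot_ms=50, staggered=False)
```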
In MANETs (Mobile Ad-Hoc Networks), securing routing has recently become a significant issue. Existing studies, however, focus either on secure routing or on the packets in which malicious operations occur, but not both. In this paper, we propose SRPPnT (A Secure Routing Protocol in MANET based on Malicious Pattern of Node and Trust Level), which considers both malicious behavior at the packet level and secure routing. SRPPnT identifies the nodes where malicious activities occur during a specific period, composes a trust level for each node, and then sets up a routing path according to the obtained trust levels, enabling efficient countermeasures against malicious operations. SRPPnT is based on AODV (Ad-Hoc On-Demand Distance Vector Routing). NS-2 network simulations show that SRPPnT finds malicious nodes more promptly and accurately than previous protocols, while reducing network load and routing more securely.
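A minimal sketch of the trust-level idea (the level scale and path-selection rule here are assumptions for illustration, not SRPPnT's exact definitions): malicious-event counts observed during a period map to discrete trust levels, and route selection prefers the path whose weakest node is most trusted:

```python
def trust_levels(malicious_counts, max_level=4):
    """Map each node's observed malicious-action count in a period to a
    discrete trust level: 0 events -> max_level, more events -> lower."""
    return {node: max(max_level - count, 0)
            for node, count in malicious_counts.items()}

def select_route(paths, trust):
    """Prefer the route whose least-trusted node has the highest trust."""
    return max(paths, key=lambda path: min(trust[n] for n in path))

# Node B misbehaved three times in the window, so routes avoid it.
trust = trust_levels({"A": 0, "B": 3, "C": 1})
route = select_route([["A", "B"], ["A", "C"]], trust)
```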
Existing route-optimization proposals for nested Network Mobility (NEMO) have difficulty optimizing a route promptly in environments where a Mobile Router (MR) moves frequently. They also incur L3 handoff latency as well as route-optimization latency until an optimized route is formed. In this paper, we propose an L3 handoff scheme that supports fast route optimization for nested NEMO without any additional optimization procedure. To achieve this, the scheme includes a procedure by which an Access Router (AR) acquires the address information of an MR. After receiving a binding update message from the MR, the AR performs the binding update procedure with the MR's Home Agent (HA) on behalf of the MR. Once a bi-directional tunnel is formed between the AR and the HA, packets are delivered to the AR passing only through the MR's HA. Our performance evaluation shows that the proposed scheme provides excellent performance compared with RRH and ONEMO.
This paper proposes a Flow-Holding-Time-based Link State Update (LSU) algorithm that minimizes LSU message overhead in QoS routing without significantly degrading QoS routing performance. We apply the flow holding time in order to decrease the number of LSU messages. We evaluated the proposed model against existing algorithms on the MCI simulation network, using the QoS routing blocking rate and the mean update rate per link as performance metrics, and thereby verified the performance of the algorithm.
With the progress of sensor technologies, Wireless Sensor Networks (WSNs) can be applied to various industrial and environmental analysis fields, and they enable automatic monitoring by sensor nodes installed over wide areas. The tiny sensor nodes recently developed for environmental analysis, however, require ever more power, because measurement fields are becoming more specialized and new, more detailed measurement fields are created by the development of new materials and applications. Furthermore, sensor nodes operated on small batteries in the field require low-cost, low-power wireless networking technology. Power efficiency is the most important factor for WSN lifetime, since sensor nodes are installed over wide areas and are hard to recover. This paper proposes a WSN algorithm for sensor nodes that achieves low power consumption and efficient measurement.
Korean WiBro has become the international standard IEEE 802.16e, and WiBro network services are being rolled out starting from the capital region. In this paper, we carried out eavesdropping experiments on voice calls, messenger programs, and VoIP traffic, which occur frequently in WiBro networks. Using Wireshark, a packet capture and analysis tool, we captured packets and reconstructed the eavesdropped data based on the SIP, H.263, TCP, and UDP protocols. We then verified the integrity of the captured VoIP packets by checking timestamps and matching them against X-Lite call records, demonstrating that the packets had not been counterfeited or forged. The data whose integrity was verified are placed in a sealed envelope and kept at the disposal of investigators, so that after unsealing in court they can be used as evidence in criminal investigations.
In this paper, we propose a distributed model that recognizes the Activities of Daily Living (ADLs) that occur in everyday living spaces. We collect and analyze a user's environment, location, and activity information through simple sensors attached to home devices and utensils, and based on this information we provide lifecare services by inferring the user's life patterns and health condition. Providing such services requires well-refined activity recognition data; without sufficient inferred information it is very hard to build an ADL recognition model for high-level situation awareness. The sequences generated by the sensors are very helpful for inferring activities, so we use them to analyze activity patterns and propose a distributed linear-time inference algorithm, which is appropriate for recognizing activities in small areas such as a home, office, or hospital. For performance evaluation, we tested the model on an open dataset from the MIT Media Lab, and the recognition results show over 75% accuracy.
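The linear-time flavor of sequence-based recognition can be sketched as follows (the sensor names and activity templates are invented for illustration; the paper's algorithm is distributed and more elaborate): a single left-to-right scan checks whether an activity's sensor-event template is embedded, in order, in the observed stream:

```python
def is_subsequence(template, events):
    """One left-to-right pass: True if the template's events occur in order
    in the stream, possibly interleaved with unrelated events."""
    it = iter(events)
    return all(e in it for e in template)

def recognize(events, templates):
    """Activities whose templates are embedded in the observed stream."""
    return [name for name, tpl in templates.items()
            if is_subsequence(tpl, events)]

templates = {"make_tea": ["kettle", "cup", "tea_bag"],
             "shower": ["bathroom_door", "shower_tap"]}
stream = ["kettle", "fridge", "cup", "tea_bag", "tv"]
```

Each template is checked in one pass over the stream, so the cost grows linearly with the number of observed events.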
A database, a logically organized collection of records or files, is widely used to manage huge amounts of data in many application areas, and the software that creates and manages such databases and answers users' requests is called a DBMS. In this study, we defined quality indexes by which the quality level of database software products, an important class of software, can be assessed. To produce the index results, we selected the necessary collection items and, through collection and analysis, confirmed which defect types mainly occur in practice. We also developed a test and evaluation model for database software and analyzed it through a pilot case.
We extracted protein signal transduction paths from protein interaction data, using the location information and weights of proteins. The protein interaction data were obtained from two-hybrid experiments with yeast. From the results, we simulated functional data for hypotonic shock, comparing the extracted paths with the signal transduction paths provided in KEGG, and we also measured the running time of the process. In the future, this research may provide a key to discovering the origins of various genetic diseases and to developing treatments.
Both On-Line Analytical Processing (OLAP) data cubes and Statistical Databases (SDBs) deal with multidimensional data sets, and both are concerned with statistical summarization over the dimensions of those data sets. There is, however, a distinction between the two: while SDBs are usually derived from other base data, OLAP data cubes often represent the base data directly. In other words, the base data of SDBs are macro-data, whereas the core cuboid data of OLAP data cubes are micro-data.
The base table in OLAP is used to populate the data cube with values of the measure attribute, and each record in the base table populates one cell of the core cuboid. Because OLAP data cubes mostly represent micro-data, some records may be absent from the base table, and the corresponding cells of the core cuboid then remain empty.
Wang et al. proposed a method for securing OLAP data cubes against privacy breaches and asserted that the method does not depend on the specific type of aggregation function. In this paper, however, we show that this assertion about aggregation functions fails whenever any cell of the core cuboid remains empty. The objective of this study is to design an inference control process for OLAP data cubes that rectifies this error in Wang's method.
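A toy example illustrates how empty cells arise and why they matter (illustrative data, not Wang et al.'s construction): cells with no backing base-table record stay empty, so an aggregate over a slice sees fewer values than the slice has cells, and reasoning that assumes fully populated cells breaks down:

```python
# Toy base table over dimensions (dept, year) with measure "salary".
base = [("sales", 2023, 100), ("sales", 2024, 120), ("hr", 2024, 90)]

depts, years = ["sales", "hr"], [2023, 2024]
cuboid = {(d, y): None for d in depts for y in years}  # cells start empty
for d, y, salary in base:
    cuboid[(d, y)] = salary  # only cells backed by a base record get a value

def aggregate(cells, fn):
    """Aggregate a slice of the cuboid, skipping empty (None) cells."""
    values = [v for v in cells if v is not None]
    return fn(values) if values else None

hr_row = [cuboid[("hr", y)] for y in years]
```

Here the hr slice has two cells but only one value, so SUM and COUNT both operate on a single value; an inference-control argument that treats every cell as populated would reason over two, which is where aggregation-function independence can break.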
This study considers the development of sponsors for social welfare facilities through the marketing mix, at a time when economic logic is being applied to the field of social welfare. Since the long-term care insurance system for the elderly took effect in July 2008, many social welfare facilities that depend on governmental budgets have competed for various resources, yet most still face serious financial difficulties. As a supplementary way to overcome this financial problem, facilities need methods for finding and managing sponsors suited to their conditions in order to secure financial resources. To that end, this study presents strategic methods for sponsor development based on the seven factors proposed by Fine (1992): product, price, promotion, place, producer, purchaser, and probing.
The main object of this study is to clarify the relationship between e-learning characteristics and e-learners' scholastic performance through an integrated research model combining the perspectives of educational technology and information technology. Using e-learning system quality, e-learning content characteristics, and interaction as independent variables, e-learners' scholastic performance as the dependent variable, and learning motivation as a mediator, this study examined the relationships among these variables. Two hundred and twelve undergraduates at a cyber university participated in the survey and filled out the questionnaires. The main results are as follows. First, content quality, technical quality, and the support of school affairs have a significant effect on e-learners' scholastic performance. Second, learning motivation plays a partial mediating role in the relationship between e-learning characteristics and e-learners' scholastic performance. The main implication of this study is that to improve e-learners' scholastic performance, we should offer e-learners more customized and varied learning plans, learning content, and interaction between e-learners and e-learning systems.