Many researchers have studied automatic chromosome karyotype classification and analysis. Automatically classifying each chromosome in a microscope image requires several sub-procedures: image pre-processing and implementation of a karyotype classifier. The image pre-processing stage performs separation of individual chromosomes, noise removal, and feature parameter extraction. The extracted morphological feature parameters were the centromeric index (C.I.), the relative length ratio (R.L.), and the relative area ratio (R.A.). In this paper, a fuzzy classifier was implemented for human chromosome karyotype classification, with the extracted morphological feature parameters used as its inputs. We also studied the selection of membership functions for an optimal fuzzy classifier for each chromosome group.
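The idea of combining the three morphological features through fuzzy membership functions can be sketched as follows. This is a minimal illustration, not the paper's classifier: the group names, membership-function centers, and widths below are hypothetical placeholders, and a min-combination of triangular memberships stands in for whatever rule base the authors tuned.

```python
# Illustrative fuzzy chromosome-group classifier.
# All membership parameters below are hypothetical, not the paper's values.

def tri(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical rules: each group is a fuzzy set over
# (centromeric index, relative length ratio, relative area ratio).
GROUPS = {
    "A": {"ci": (0.35, 0.5, 0.65), "rl": (0.6, 0.8, 1.0),  "ra": (0.6, 0.8, 1.0)},
    "B": {"ci": (0.2, 0.3, 0.4),   "rl": (0.4, 0.55, 0.7), "ra": (0.4, 0.55, 0.7)},
    "C": {"ci": (0.05, 0.15, 0.25),"rl": (0.1, 0.25, 0.4), "ra": (0.1, 0.25, 0.4)},
}

def classify(ci, rl, ra):
    """Return the group with the highest min-combined membership degree."""
    scores = {}
    for g, mf in GROUPS.items():
        scores[g] = min(tri(ci, *mf["ci"]), tri(rl, *mf["rl"]), tri(ra, *mf["ra"]))
    return max(scores, key=scores.get)
```

A long, metacentric chromosome such as `classify(0.5, 0.8, 0.8)` lands in the hypothetical group "A" here; the real design question the abstract raises is which membership-function shapes work best per group.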
Moving-object database systems manage sets of moving objects whose locations and directions change continuously. Traditional spatial indexing schemes are not suitable for moving objects because they were designed to manage static spatial data: since the location of a moving object changes continuously, the cost of dynamically reconstructing an existing spatial index structure becomes excessive. In this paper, we analyze the insertion/deletion costs of processing object movement. The results of our extensive experiments show that the dynamic hashing index typically outperforms both the original R-tree and the fixed grid by a large margin.
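Why movement is cheap in a hash-bucketed grid can be shown with a small sketch. This is a generic illustration under assumed parameters (cell width, unbounded world), not the paper's dynamic hashing index: the point is only that a move triggers a delete+insert on the index just when the object crosses a cell boundary.

```python
# Illustrative grid-bucket index for moving objects (not the paper's structure).
from collections import defaultdict

CELL = 10.0  # hypothetical cell width

class GridIndex:
    def __init__(self):
        self.cells = defaultdict(set)   # (cx, cy) -> set of object ids
        self.pos = {}                   # object id -> (x, y)

    def _cell(self, x, y):
        return (int(x // CELL), int(y // CELL))

    def insert(self, oid, x, y):
        self.cells[self._cell(x, y)].add(oid)
        self.pos[oid] = (x, y)

    def move(self, oid, x, y):
        """Delete + re-insert in the index only when a cell boundary is crossed."""
        old = self._cell(*self.pos[oid])
        new = self._cell(x, y)
        if old != new:
            self.cells[old].discard(oid)
            self.cells[new].add(oid)
        self.pos[oid] = (x, y)

    def query_cell(self, x, y):
        return set(self.cells[self._cell(x, y)])

idx = GridIndex()
idx.insert(1, 5.0, 5.0)
idx.move(1, 6.0, 6.0)    # same cell (0, 0): no bucket update needed
idx.move(1, 15.0, 5.0)   # crosses into cell (1, 0): one delete + one insert
```

An R-tree, by contrast, may have to split or merge nodes on each such update, which is the reconstruction cost the abstract measures.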
Redundant expressions must be eliminated in order to optimize expressions in the SSA form produced by CTOC. This paper applies value numbering (VN) for this purpose. To carry out VN, an SSA graph must first be generated to maintain the information in the SSA form; equivalent nodes are then found and strongly connected components (SCCs) are computed. Equivalent nodes are assigned an identical value number (valnum) through the SCCs. After VN, we confirmed the elimination of many nodes that had been added during SSA construction. The valnum can also be applied to optimization and type inference.
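The core of value numbering can be sketched briefly. This is the simple hash-based variant on acyclic code, assuming expressions appear in topological order; the SCC machinery the paper uses to handle φ-cycles in the SSA graph is omitted, and the expression encoding is my own illustration.

```python
# Illustrative hash-based value numbering over SSA expressions
# (the paper's SCC handling of phi-cycles is not shown).

def value_number(exprs):
    """exprs: ordered dict of SSA name -> leaf constant or (op, arg1, arg2)."""
    valnum = {}          # SSA name -> value number
    table = {}           # canonical expression key -> value number
    next_vn = [0]

    def vn_of(key):
        if key not in table:
            table[key] = next_vn[0]
            next_vn[0] += 1
        return table[key]

    for name, e in exprs.items():
        if isinstance(e, tuple):
            op, a, b = e
            args = (valnum[a], valnum[b])
            if op in ("+", "*"):          # commutative ops: canonicalize operand order
                args = tuple(sorted(args))
            key = (op,) + args
        else:
            key = ("leaf", e)
        valnum[name] = vn_of(key)
    return valnum

# x and y are the same constant, so x+y and y+x get one shared valnum,
# making one of the two additions redundant and eliminable.
exprs = {"x": 1, "y": 1, "a": ("+", "x", "y"), "b": ("+", "y", "x")}
vn = value_number(exprs)
```

Once `a` and `b` share a valnum, the later definition can be replaced by a reference to the earlier one, which is exactly the redundancy elimination the abstract describes.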
In this thesis, we present a component design and retrieval prototype system that can retrieve the most appropriate component for a reuse system. If only an object is used as the reuse unit, mutual interactions cannot be understood and relationships between objects cannot be sustained. Therefore, we propose clustered components as the reuse unit. This makes relationships and interactions between objects understandable, so users can retrieve the proper component for a reuse system. The proposed retrieval prototype system can select the correct items, providing a three-facet retrieval environment that guards against ambiguous retrieval definitions.
In a ubiquitous environment, the growth of various services and equipment is forecast to increase both the number of multicast users and the diversity of hacking attacks on multicast keys. The rapid increase in multicast users and application security protocols reduces the performance of a central key management system. Accordingly, in this paper we propose enhancing the key management mechanism for greater efficiency and stability of multicast services. A comparison and simulation of existing key management mechanisms is used to analyze these problems, and we propose an advanced SMKD (Secure Multicast Key Distribution) mechanism, applying small groups and key length control together with a new security protocol, to solve them. The SMKD model in this paper helps reduce the key distribution and encryption load of a central key management system, and can also ensure the stability of such a system through efficient key management.
While developing a complex reactive software system, it is very important to analyze user requirements and reflect them in the developed system; understanding the needs of users precisely and promptly is therefore key to successful software system development. Among requirement specification languages, message sequence charts (MSCs), also known as sequence diagrams in UML, are the most widely used scenario notation. Live Sequence Charts (LSCs) are a variant of MSCs characterized by a message abstraction facility and the modality of scenarios. In this paper, I define the formal semantics of the LSC specification, including essential language constructs such as pre-charts, variables, assignments, and conditions. The range of the formalized LSC language has been broadened, and the scope of the formalized semantics is much closer to the complete LSC specification.
Applying BPM, a business process management tool, to the SCM sector makes efficient process management and control available. BPM can also effectively execute the integrated processes that compose SCM. This approach allows the progress of SCM processes to be managed and monitored more efficiently, and improvement plans can be established by analyzing process performance results. Thus, this paper introduces BPM into the SCM environment and presents a plan that executes integration and improves business processes effectively by applying data mining techniques to the SCM process.
In bioinformatics, searching for similar sequences in biological databases plays an important role in predicting functional or structural information. Biological sequence data have increased dramatically since the Human Genome Project. Because search speed is a crucial factor in predicting function or structure, SMP (Symmetric Multi-Processor) computers or clusters are used to improve search performance. As a method to improve the search time of BLAST (Basic Local Alignment Search Tool), widely used for similarity search, we propose the nBLAST algorithm, which runs in a cluster environment. Because nBLAST uses MPI (Message Passing Interface), a parallel library, to distribute queries to each node and execute them in parallel without modifying the existing BLAST source code, BLAST can be parallelized easily without complicated procedures such as reconfiguration. In addition, experiments running nBLAST on a 28-node Linux cluster confirmed that performance improves as the number of nodes increases.
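The query-distribution idea behind this kind of parallelization can be sketched as follows. This is a plain-Python illustration, not nBLAST itself: the real system obtains the rank from MPI and invokes the unmodified BLAST binary on each node, which is indicated only in comments here, and `run_node` is a hypothetical stand-in.

```python
# Illustrative round-robin query distribution (the idea behind nBLAST).
# Real code would use mpi4py / the BLAST binary; both are only sketched
# in comments so this runs anywhere.

def partition_queries(queries, size):
    """Split a query list into `size` round-robin chunks, one per MPI rank."""
    return [queries[r::size] for r in range(size)]

def run_node(rank, chunk):
    """Hypothetical per-node worker: each node would run the unmodified
    BLAST executable on its own chunk of queries."""
    # from mpi4py import MPI                  # real version: rank from COMM_WORLD
    # subprocess.run(["blastall", "-i", ...]) # hypothetical invocation
    return [(rank, q) for q in chunk]         # stand-in for BLAST hits

queries = ["q1", "q2", "q3", "q4", "q5"]
chunks = partition_queries(queries, 2)        # 2 nodes
results = [hit for r, c in enumerate(chunks) for hit in run_node(r, c)]
```

Because each query is independent, no change to BLAST's internals is needed; only this outer distribution and result-gathering layer is added, which matches the abstract's claim.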
Classification technology is essential for fast retrieval in large multimedia databases. This paper proposes a model combining a GA (Genetic Algorithm) and an SVM (Support Vector Machine) for fast retrieval. We use color and texture as feature vectors. The proposed model improves retrieval accuracy by selecting an optimal feature vector set from the extracted feature vector sets. The first performance test compared color, texture, and the feature vector combining color and texture; the second compared the SVM alone against the proposed algorithm. The experiments showed that the combined color-texture feature vector performs better than a single feature vector, and that the proposed hybrid algorithm also performs better than the SVM algorithm alone.
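The GA wrapper around the classifier can be sketched compactly. This is a generic illustration under stated assumptions: in the paper's model the fitness of a feature subset would be SVM retrieval accuracy; here a toy fitness function (which rewards a hypothetical set of "informative" features and penalizes subset size) stands in so the sketch runs without any ML library.

```python
# Illustrative GA feature-subset selection. The fitness function is a toy
# stand-in for SVM accuracy; USEFUL is a hypothetical informative-feature set.
import random

N_FEATURES = 8
USEFUL = {0, 2, 5}  # hypothetical "informative" features

def fitness(mask):
    """Stand-in for SVM accuracy: reward useful features, penalize size."""
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & USEFUL) - 0.1 * len(chosen)

def ga_select(pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitism: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_FEATURES)
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.2:                # bit-flip mutation
                i = rng.randrange(N_FEATURES)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return {i for i, bit in enumerate(best) if bit}

best = ga_select()
```

Swapping the toy fitness for cross-validated SVM accuracy on color/texture features yields the hybrid retrieval model the abstract describes.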
This study aims to identify the effect that perceived usefulness and perceived ease of use have on learner flow in an e-learning community. Based on a literature review and the Technology Acceptance Model (TAM), a potential model and five hypotheses were proposed. A questionnaire was administered to 62 members of an e-learning community for preparatory teachers; its Cronbach's alpha was .88. The collected data were analyzed through correlation analysis and path analysis. The results are as follows. Three hypotheses were supported: perceived usefulness affects attitudinal flow, perceived ease of use affects attitudinal flow, and attitudinal flow affects behavioral flow. Two hypotheses were rejected: perceived usefulness affects perceived ease of use, and perceived ease of use affects behavioral flow. The model revised through path analysis showed good fit: the overall fit measures (RMSEA, CFI, NNFI), indexes of model suitability, were quite good. The findings suggest important strategies for designing e-learning communities that promote learner flow.
Type classification is an essential step in recognizing a language with a huge character set such as Korean. Since most previous research is based on the composition rules of Korean characters, it has been difficult to correctly classify composite-vowel characters, and the problem space was not divided evenly because the last consonant, which is relatively larger than other graphemes, was not further classified. In this paper, I propose a new type classification method in which the horizontal vowel is extracted before the vertical vowel, and last consonants are further classified into one of five small groups based on the horizontal projection profile. The new method uses 19 character types, which is more stable than the previous 6 or 15 types. Through experiments on the 1,000 most frequently used characters and 30,614 characters scanned from several magazines, I show that the proposed method is more useful for classifying the huge Korean character set.
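What a horizontal projection profile is, and how it can separate stroke patterns, can be shown in a few lines. This is only an illustration of the measurement itself; the paper's actual five-group rules for last consonants are not reproduced here, and the threshold and sample bitmap are hypothetical.

```python
# Illustrative horizontal projection profile of a binary character bitmap.
# The threshold and grouping rule are hypothetical, not the paper's.

def h_projection(bitmap):
    """Count foreground pixels in each row of a binary bitmap."""
    return [sum(row) for row in bitmap]

def stroke_rows(profile, thresh=3):
    """Count runs of rows whose pixel count reaches the threshold;
    each run corresponds to one horizontal stroke."""
    runs, in_run = 0, False
    for v in profile:
        if v >= thresh and not in_run:
            runs, in_run = runs + 1, True
        elif v < thresh:
            in_run = False
    return runs

bitmap = [
    [1, 1, 1, 1, 1],   # a long horizontal stroke
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],   # another horizontal stroke
]
```

Counting such stroke runs in the last-consonant region is one plausible way a projection profile distinguishes consonant shapes, which is the kind of evidence the proposed five-group classification uses.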
With the progress of the IT (information technology) industry, our lives gradually become more convenient, but environmental pollution and the threat of disease grow in proportion. We must prevent dangerous diseases that threaten human life, or we will face irrevocable situations. Nevertheless, managing one's own health amid a busy modern lifestyle is very difficult. Therefore, we need to manage our health using sensors based on a ubiquitous IT environment. In this paper, we propose a diagnostic model that can diagnose and help prevent cerebrovascular disease based on ubiquitous technology. As a step toward implementing a u-healthcare diagnosis system, the cerebrovascular disease diagnosis model plays an important role in deciding clinical results. In the future, this model may improve our welfare and health.
Flash memory has been increasingly used in handheld devices not only for data storage but also for code storage. Because NAND flash memory provides only sequential access, the traditionally accepted solution for executing programs from NAND flash is shadowing. However, shadowing has significant drawbacks: it increases system boot time and consumes considerable DRAM space. Demand paging has therefore received significant attention for program execution from NAND flash memory, but no prior effort has bounded the demand paging cost or analyzed its worst-case performance. For worst-case timing analysis of programs running from NAND flash memory, worst-case demand paging costs must be estimated. In this paper, we propose two WCRT analysis methods that account for demand paging costs, DP-Pessimistic and DP-Accurate, which differ in analysis accuracy and complexity, and we compare their accuracy by simulation.
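The gap between a pessimistic and a tighter paging-cost bound can be illustrated numerically. This is my own toy model, not the paper's DP-Pessimistic/DP-Accurate analyses: the page-load cost is a hypothetical constant, and the "accurate" bound here simply counts LRU misses over a reference trace, whereas a real WCRT analysis would bound misses statically.

```python
# Illustrative contrast between a pessimistic and a tighter bound on
# demand-paging cost for one task (toy model, hypothetical numbers).

PAGE_LOAD_US = 200  # hypothetical cost to load one page from NAND flash

def paging_cost_pessimistic(page_refs):
    """Assume every page reference misses: safe but loose bound."""
    return len(page_refs) * PAGE_LOAD_US

def paging_cost_lru(page_refs, ram_pages):
    """Tighter bound: count misses of an LRU page cache over the trace."""
    cache, misses = [], 0
    for p in page_refs:
        if p in cache:
            cache.remove(p)           # refresh recency
        else:
            misses += 1
            if len(cache) == ram_pages:
                cache.pop(0)          # evict least recently used page
        cache.append(p)
    return misses * PAGE_LOAD_US

refs = [1, 2, 1, 3, 1, 2]             # page reference trace of a task
```

For this trace, the pessimistic bound charges all 6 references while the LRU count charges only 4 misses with 2 RAM pages; the paper's DP-Accurate plays the analogous tightening role inside WCRT analysis.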
This paper proposes a new directory-based cache coherence scheme that significantly reduces coherence traffic by omitting unnecessary write-backs to home nodes for migratory, exclusively-modified data. The proposed protocol is well matched to migratory data accessed in turn by processors, since write-backs to home nodes are never needed for such migratory sharing. Simulation results show that our protocol dramatically alleviates coherence traffic, and that the traffic reduction also shortens network latency and execution time.
Recently, many wearable computers have been developed, but they still have many user interface problems from both input and output perspectives. This paper presents a wearable user interface based on an EOG (electrooculogram) sensing circuit and marker recognition. In the proposed interface, the EOG sensor circuit, which tracks eye movement by sensing the potential difference across the eye, is used as a pointing device. Objects to be manipulated are represented by human-readable markers, and the marker recognition system detects and recognizes markers in the camera input image. When a marker is recognized, the corresponding property window and method window are displayed on the head-mounted display. Users manipulate an object by selecting a property or method item from the window. Using the EOG sensor circuit and the marker recognition system, an object can be manipulated with eye movement alone in a wearable computing environment.
In this paper, we implement a ubiquitous healthcare system that can measure and check human blood pressure anytime, anywhere. The implemented prototype is composed of a blood pressure measurement terminal, a data-gathering base node, and a medical information server. The measurement node constructs a sensor network using the Zigbee protocol and runs TinyOS. The data-gathering base node is a Linux-based node that transfers sensed medical data over a wireless LAN. The medical information server stores the processed medical data and can promptly notify the connected medical team of an urgent status. Through experiments, we confirmed the feasibility of a ubiquitous healthcare system based on a Zigbee sensor network.
This paper presents an automatic algorithm, based on background edge generation, for detecting moving objects under a moving camera. The background image is generated by rotating a camera fixed on a tripod horizontally, then aligning and reorganizing the captured images. We develop an efficient approach for robust panoramic background edge generation, as well as a method for edge matching between the input image and the background image. We applied the proposed algorithm to real image sequences. The proposed method can be successfully used in various monitoring systems such as intrusion detection and video surveillance.
As information demand grows with the development of information technology, hazards to the confidentiality, integrity, and availability of information technology resources increase, and control over information, systems, and data, both inside and outside the organization, is demanded. This dissertation investigates the required functions and structure of an information technology risk management system that can manage the various risks infiltrating an organization, surveying and analyzing domestic and foreign cases. Finally, it designs an integrated risk management process model whose objective is to enable organizations to prepare against hazards in advance.
This paper addresses unknown DoS attacks on VoIP service networks that mix DoS attacks, worms, and Trojan horses using IP source address spoofing and Smurf, carried out through SYN flooding with UDP, ICMP, Echo, and TCP SYN packets against applications that use TCP/UDP. We define the need for a dynamic update engine for prevention, measure miss traffic in real-time statistics of inbound and outbound traffic when designing the engine in an IPS with a self-learning module and statistical attack analysis, and design a logic engine module. Three engines judge attack grades (Attack, Suspicious, Normal) and keep the filtering engine in its optimal state through AND/OR algorithms in the footprint lookup modules. The real-time dynamic engine and filter, through updates, protected the VoIP service from DoS attacks and strengthened ubiquitous security.
Computer forensics functions by documenting the effects of incidents and extracting evidence of those effects for presentation in court. If the integrity of digital evidence is compromised during the investigation, critical evidence may be denied or not even presented at trial. This paper discusses integrity-establishing chain-of-custody procedures in disk forensics, system forensics, network forensics, mobile forensics, and database forensics. Once integrity is established by the proposed methods, the products of the investigation can be adopted as leading evidence. Moreover, the issues and alternatives in the reality of digital investigation are presented along with actual computer forensics cases, hopefully contributing to advances in computer forensics and research in information security.
Because it is not easy to distinguish normal service traffic from DDoS (Distributed DoS) attack traffic such as Internet worms, normal packets may be mistaken for harmful traffic and service may be denied exactly as the attacker intends. Moreover, research on harmful traffic in the IPv6 environment is weak. In this dissertation, hosts in an IPv6 environment are attacked with NETWIB and their attack traffic is monitored; statistical information about the traffic is then obtained from the MIB (Management Information Base) objects used in IPv6. By applying the ESM (Exponential Smoothing Method) to this information, a normal traffic boundary, i.e., a threshold, is determined. Input traffic exceeding the threshold is regarded as attack traffic.
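The ESM threshold idea can be written out directly from the standard smoothing recurrence S_t = αX_t + (1-α)S_{t-1}. The smoothing constant α and the margin factor k below are hypothetical tuning parameters, not values from the dissertation.

```python
# Illustrative ESM-based traffic threshold.
# alpha and the margin factor k are hypothetical tuning parameters.

def esm_threshold(samples, alpha=0.3, k=1.5):
    """Exponentially smooth a traffic series and return the next threshold."""
    s = samples[0]
    for x in samples[1:]:
        s = alpha * x + (1 - alpha) * s   # S_t = a*X_t + (1-a)*S_{t-1}
    return k * s                          # traffic above k*S_t is flagged

def is_attack(traffic, threshold):
    return traffic > threshold

normal = [100, 110, 95, 105, 100]         # e.g., MIB packet counts per interval
threshold = esm_threshold(normal)
```

Smoothing keeps the boundary adaptive: a slow drift in normal load moves the threshold with it, while a sudden flood far exceeds k times the smoothed level and is flagged.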
RFID technology has gradually expanded its application areas; however, studies on personal privacy infringement and security are insufficient. This paper analyzes existing RFID security protocols, proposes a new security protocol using an access time interval scheme and the RSA algorithm, and attempts to solve the problems of lightweight protocols. The proposed protocol enforces information protection on two-way channels, and sniffing and man-in-the-middle attacks can be countered by applying a mutual authentication technique between tags and readers.
This study proposes a framework to clarify quality problems of context information in ubiquitous computing environments and to assess its reliability. The framework is structured as a sequence of steps for measuring the quality of context information. The first step is to determine the users of the context information; this is important because the type of users or applications determines the type of context information, and thus the methods for measuring the quality dimensions and the thresholds for evaluating quality. The other steps include methods for measuring each quality dimension to allow quantitative evaluation, and establishing acceptable quality targets. We selected accuracy, completeness, up-to-dateness, access security, and representation as quality dimensions and proposed measurement methods and concrete procedures for them, enabling objective evaluation of the quality level of context information.
As wireless networks become popular and widespread, it is necessary to develop IDS (Intrusion Detection System)/filtering systems against malicious wireless traffic. In this paper, we propose the W-Sensor software, which detects malicious wireless traffic, and the W-TMS system, which filters malicious traffic based on W-Sensor logs. The proposed W-Sensor software detects malicious traffic efficiently and adapts rapidly to changes in security rules. Installed on a notebook, the W-Sensor supports IDS mobility, in contrast to existing AP-based sensors.
An ad hoc network, unlike wired networks, is a self-organized network of mobile nodes in a wireless environment. In such routing and operating environments, link breaks occur frequently, and knowing the positions of nodes can prevent or recover from this phenomenon. Typically, GPS is used to determine node positions, but its high cost and construction complexity limit applications. In this paper, we propose and design a method for constructing a node position map using only the information the nodes themselves provide, without GPS. The proposed method complements and overcomes previous systems with a software solution; hence, construction and operation are simple, and construction expenses can be decreased.
MPEG-21's digital item adaptation technology opens a new way to universal multimedia access. It needs transcoders to change a media resource's format and other properties according to the delivery context. However, a heavyweight transcoder that integrates various transcoding functions into one unit is too complicated and inflexible to support universal multimedia access. A unit transcoder, which has only one transcoding function, is useful for resolving this problem; this in turn requires considering how to compose a set of unit transcoders. Thus, given a set of end-to-end service quality pairs defined by the user according to the character of the application, this study suggests how to compose a complete set of unit transcoders that can always create one or more transcoding paths for each pair in the set. This method, however, may create too many transcoding paths for each pair of end-to-end service qualities; therefore, this study also suggests an algorithm that generates a minimum unit transcoder set, supporting multimedia adaptation with the fewest unit transcoders. The suggested algorithm was implemented in a multimedia streaming engine, and this paper describes the experimental results.
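Composing unit transcoders into a path is essentially graph search: each unit transcoder is an edge from one format/quality to another, and a transcoding path is a route through those edges. The sketch below shows the idea with BFS over a hypothetical transcoder set; it illustrates path existence only, not the paper's minimization algorithm.

```python
# Illustrative transcoding-path search over unit transcoders.
# The formats and transcoder set are hypothetical examples.
from collections import deque

# Each unit transcoder performs exactly one conversion: (source, target).
UNIT_TRANSCODERS = [
    ("mpeg2", "mpeg4"),
    ("mpeg4", "h264"),
    ("mpeg2", "wav"),
]

def transcoding_path(src, dst):
    """BFS over unit transcoders; returns a shortest format chain or None."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for a, b in UNIT_TRANSCODERS:
            if a == path[-1] and b not in seen:
                seen.add(b)
                queue.append(path + [b])
    return None
```

A complete unit transcoder set, in the abstract's sense, is one for which this search never returns `None` for any required end-to-end quality pair; the minimization problem is then to keep that property with as few edges as possible.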
Nowadays, the administration is strongly driving self-innovation in the public sector to solve problems continuously pointed out as its weaknesses, such as low management achievement, low productivity, a high-cost structure, a low ability to cope with market changes, and ineffectiveness in managing organizations. The core of such renovation is systematic performance management. Provincial public enterprises are likewise required to adopt systems that can measure management results accurately and make payments according to measured results under an autonomous management system, in order to raise management efficiency and strengthen competitive power. In this paper, we design a framework for measuring the management results of provincial public enterprises based on the Balanced Scorecard and show an example applied to the Gwangju Metropolitan City Environmental Installations Corp. (GEIC).
This study investigates what organizational capabilities mobile internet service users believe Wibro service providers should possess. In particular, from an exploratory point of view, it examines whether this recognition differs by the characteristics of Wibro service users. The study also suggests several research propositions regarding the relationships between organizational capabilities and the characteristics of Wibro service users, drawn from various statistical analyses. To accomplish these purposes, the study defined the characteristics of Wibro service users (gender, age, job) and the organizational capabilities of Wibro service providers (accounting/finance, production/service, marketing/sales, and R&D/technology capabilities) based on a review of past mobile internet service studies, and then performed a questionnaire survey of latent Wibro service users along with various statistical analyses.
Performing a probabilistic safety assessment (PSA) requires a large amount of data from various fields, and the quality of PSA results has become a more important element of risk assessment. As part of enhancing PSA quality, the Korea Atomic Energy Research Institute is developing a full-power Human Reliability Analysis (HRA) calculator to manage human failure events (HFEs) and to calculate diagnosis and execution human error probabilities. This paper introduces the development process and an overview of a standard HRA method for nuclear power plants. The study was carried out in three stages: 1) development of the procedures and rules for a standard HRA method, 2) design of the system structure, and 3) development of the HRA calculator.
We propose a new expert system for recovering broken fragments of relics into their original form using computer graphics and image processing. This paper presents a system, applied to tombstones (flat objects with letters carved in), that assembles fragments by placing each fragment in its correct position. The matching process contains three sub-processes: aligning the front face and letters of an object, identifying the matching directions, and determining the detailed matching positions. We apply least-squares fitting, vector inner products, and geometric and RGB errors to the matching process. It turned out that 2-D translations via fragment alignment save computational load significantly. The performance of the proposed method is illustrated with experimental results on damaged cultural fragments.
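The least-squares step for a pure 2-D translation has a closed form worth showing: given corresponding edge points on two fragments, the translation minimizing the summed squared geometric error is simply the mean offset. The point data below are an invented example; rotation and the RGB error term the paper also uses are omitted.

```python
# Illustrative least-squares 2-D translation between corresponding
# edge points of two fragments (invented sample points; rotation omitted).

def best_translation(pts_a, pts_b):
    """Least-squares translation mapping pts_a onto pts_b: the mean offset."""
    n = len(pts_a)
    dx = sum(bx - ax for (ax, _), (bx, _) in zip(pts_a, pts_b)) / n
    dy = sum(by - ay for (_, ay), (_, by) in zip(pts_a, pts_b)) / n
    return dx, dy

def match_error(pts_a, pts_b, dx, dy):
    """Sum of squared geometric errors after applying the translation."""
    return sum((ax + dx - bx) ** 2 + (ay + dy - by) ** 2
               for (ax, ay), (bx, by) in zip(pts_a, pts_b))

a = [(0, 0), (1, 0), (0, 1)]
b = [(2, 3), (3, 3), (2, 4)]   # a shifted by (2, 3)
dx, dy = best_translation(a, b)
```

Restricting the search to such translations after the front faces are aligned is what makes the computational saving the abstract reports plausible: each candidate match costs only a mean and an error sum.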
This paper proposes a method to detect falling actions from stereo images in order to recognize emergency situations. It uses 3D information to extract visual features for learning and testing, and uses an HMM (Hidden Markov Model) as the recognition algorithm. The proposed system extracts background images from the two camera images, then extracts a moving object from the input video sequence using the difference between the input image and the background image. It then finds the bounding rectangle of the moving object and extracts 3D information using the calibration data of the two cameras. We measured the recognition rate of falling actions using two feature sets: the variation of the rectangle's width and height, and the variation of the 3D location of the rectangle's center point. Experimental results show that the variation of the 3D center-point location achieves a higher recognition rate than the variation of width and height.
Much research addresses the issues and problems of mobile database systems caused by the weak connectivity of wireless networks and the mobility and portability of mobile clients. Mobile computing satisfies users' demands for convenience and for access to information at any time and in any place, but it has many data management problems to be solved. The purpose of our study is to design a Mobile Continuous Query Processing System (MCQPS) to solve problems related to database hoarding, maintenance of shared data consistency, and optimization of logging, which are caused by the weak connectivity and disconnection of wireless networks inherent in mobile database systems under mobile client-server environments. We demonstrate the superiority of the proposed MCQPS by comparing its performance to the CIS (Client-Intercept-Server) model. In addition, we experiment with the proposed index structure and methodology in various ways.