Completion of Ontologies and Ontology Networks
Author: Zlatan Dragisic
Publisher: Linköping University Electronic Press
ISBN: 9176855228
Category: Computers
Languages: en
Pages : 88
Book Description
The World Wide Web contains large amounts of data, and in most cases this data has no explicit structure. The lack of structure makes it difficult for automated agents to understand and use such data. A step towards a more structured World Wide Web is the Semantic Web, which aims at introducing semantics to data on the World Wide Web. Ontologies are one of the key technologies in this endeavour: they provide a means for modeling a domain of interest and are used for search and integration of data. In recent years many ontologies have been developed. To be able to use multiple ontologies it is necessary to align them, i.e., to find inter-ontology relationships. However, developing and aligning ontologies is not an easy task, and it is often the case that ontologies and their alignments are incorrect and incomplete. This can be a problem for semantically-enabled applications: incorrect and incomplete ontologies and alignments directly influence the quality of the results of such applications, as wrong results can be returned and correct results can be missed.
This thesis focuses on the problem of completing ontologies and ontology networks. The contributions of the thesis are threefold. First, we address the issue of completing the is-a structure and alignments in ontologies and ontology networks. We have formalized the problem of completing the is-a structure in ontologies as an abductive reasoning problem and developed algorithms as well as systems for dealing with the problem. With respect to the completion of alignments, we have studied system performance in the Ontology Alignment Evaluation Initiative, a yearly evaluation campaign for ontology alignment systems. We have also addressed the scalability of ontology matching, one of the current challenges, by developing an approach for reducing the search space when generating the alignment.
Second, high-quality completion requires user involvement. As users' time and effort are a limited resource, we address the issue of limiting and facilitating user interaction in the completion process. We have conducted a broad study of state-of-the-art ontology alignment systems and identified different issues related to the process. We have also conducted experiments to assess the impact of user errors in the completion process.
While the completion of ontologies and ontology networks can be done at any point in their life-cycle, some of the issues can be addressed already in the development phase. The third contribution of the thesis addresses this by introducing ontology completion and ontology alignment into an existing ontology development methodology.
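To make the first contribution more concrete, here is a minimal sketch of the intuition behind using an ontology network to find candidate missing is-a relations. It is not the thesis's abduction-based algorithm or any of its systems: the two toy ontologies, the alignment, and all concept names below are invented for illustration, and any detected candidate would still need validation by a domain expert.

```python
# Minimal sketch: detect candidate missing is-a relations in one ontology by
# exploiting the structure of an ontology network (toy data, invented names).
from itertools import product

def transitive_closure(edges):
    """Return all (sub, super) pairs derivable from a set of direct is-a edges."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(list(closure), list(closure)):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True
    return closure

# Two toy ontologies, each a set of direct is-a edges (subclass, superclass).
onto1 = {("NoseCancer", "Cancer"), ("RespiratoryCancer", "Cancer")}
onto2 = {("NasalNeoplasm", "RespiratoryTractNeoplasm")}

# Alignment between the two ontologies: equivalence mappings.
alignment = {("NoseCancer", "NasalNeoplasm"),
             ("RespiratoryCancer", "RespiratoryTractNeoplasm")}

# The network combines both ontologies' is-a edges with the mappings,
# read in both directions (equivalence implies mutual is-a).
network = onto1 | onto2
for a, b in alignment:
    network |= {(a, b), (b, a)}

derivable = transitive_closure(network)

# Candidate completions for onto1: is-a relations between onto1's own concepts
# that are derivable in the network but not within onto1 alone.
onto1_concepts = {c for edge in onto1 for c in edge}
own = transitive_closure(onto1)
candidates = {(a, b) for (a, b) in derivable
              if a in onto1_concepts and b in onto1_concepts
              and a != b and (a, b) not in own}

print(candidates)  # {('NoseCancer', 'RespiratoryCancer')}
```

The closure is recomputed naively, so this does not scale, but it shows why an alignment lets one ontology borrow structure from another when completing its is-a hierarchy.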
NeOn Methodology for Building Ontology Networks
Author: Mari Carmen Suárez-Figueroa
Publisher: IOS Press, Inc.
ISBN: 9781614991151
Category: Computers
Languages: en
Pages : 292
Book Description
A new ontology development paradigm has emerged. Its emphasis lies on the reuse and possible subsequent reengineering of knowledge resources, on collaborative and argumentative ontology development, and on the building of ontology networks; this new trend is the opposite of building new ontologies from scratch. To help ontology developers in this new paradigm, it is important to provide strong methodological support. However, to date, there are no methodological approaches that help ontology developers to build large ontologies embedded in ontology networks in complex settings where distributed teams could
System-Level Analysis and Design under Uncertainty
Author: Ivan Ukhov
Publisher: Linköping University Electronic Press
ISBN: 9176854264
Category:
Languages: en
Pages : 194
Book Description
One major problem for the designer of electronic systems is the presence of uncertainty, which is due to phenomena such as process and workload variation. Very often, uncertainty is inherent and inevitable. If ignored, it can lead to degradation of the quality of service in the best case and to severe faults or burnt silicon in the worst case. Thus, it is crucial to analyze uncertainty and to mitigate its damaging consequences by designing electronic systems in such a way that they effectively and efficiently take uncertainty into account.
We begin by considering techniques for deterministic system-level analysis and design of certain aspects of electronic systems. These techniques do not take uncertainty into account, but they serve as a solid foundation for those that do. Our attention revolves primarily around power and temperature, as they are of central importance for attaining robustness and energy efficiency. We develop a novel approach to dynamic steady-state temperature analysis of electronic systems and apply it in the context of reliability optimization.
We then proceed to develop techniques that address uncertainty. The first technique is designed to quantify the variability of process parameters, which is induced by process variation, across silicon wafers based on indirect and potentially incomplete and noisy measurements. The second technique is designed to study diverse system-level characteristics with respect to the variability originating from process variation. In particular, it allows for analyzing transient temperature profiles as well as dynamic steady-state temperature profiles of electronic systems. This is illustrated by considering a problem of design-space exploration with probabilistic constraints related to reliability. The third technique that we develop is designed to efficiently tackle the case of sources of uncertainty that are less regular than process variation, such as workload variation. This technique is exemplified by analyzing the effect that workload units with uncertain processing times have on the timing-, power-, and temperature-related characteristics of the system under consideration.
We also address the issue of runtime management of electronic systems that are subject to uncertainty. In this context, we perform an early investigation of the utility of advanced prediction techniques for the purpose of fine-grained long-range forecasting of resource usage in large computer systems. All the proposed techniques are assessed by extensive experimental evaluations, which demonstrate the superior performance of our approaches to analysis and design of electronic systems compared to existing techniques.
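As a rough intuition for what analyzing a system-level characteristic with respect to process variation can mean, the following is a deliberately naive Monte Carlo sketch: it samples a process parameter, pushes each sample through a toy leakage and thermal model, and estimates the probability that a temperature constraint is violated. The thesis relies on considerably more efficient techniques; every model and constant below is an invented assumption, not a value from the thesis.

```python
# Naive Monte Carlo propagation of process-parameter uncertainty to a
# steady-state temperature metric. The leakage and thermal models and all
# constants are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

N = 100_000            # number of Monte Carlo samples
T_AMB = 45.0           # ambient temperature, deg C (assumed)
R_TH = 0.8             # chip-to-ambient thermal resistance, K/W (assumed)
P_DYN = 12.0           # nominal dynamic power, W (assumed)
T_LIMIT = 85.0         # temperature constraint, deg C (assumed)

# Process variation: threshold-voltage deviation modeled as Gaussian noise.
delta_vth = rng.normal(loc=0.0, scale=0.05, size=N)        # volts

# Toy leakage model: leakage power grows exponentially as Vth drops.
p_leak = 4.0 * np.exp(-delta_vth / 0.05)                   # watts

# Toy thermal model: steady-state T = T_amb + R_th * (P_dyn + P_leak).
temperature = T_AMB + R_TH * (P_DYN + p_leak)

print(f"mean temperature: {temperature.mean():.1f} C")
print(f"P(T > {T_LIMIT:.0f} C) = {np.mean(temperature > T_LIMIT):.3f}")
```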
Ontology-Based Information Retrieval for Healthcare Systems
Author: Vishal Jain
Publisher: John Wiley & Sons
ISBN: 1119641381
Category: Computers
Languages: en
Pages : 384
Book Description
With the advancement of the Semantic Web, ontologies have become a crucial mechanism for representing concepts in various domains. For the research and dispersal of customized healthcare services, a major challenge is to efficiently retrieve and analyze individual patient data from a large volume of heterogeneous data over a long time span. This requirement demands effective ontology-based information retrieval approaches for clinical information systems so that pertinent information can be mined from large amounts of distributed data. This unique and groundbreaking book highlights the key advances in ontology-based information retrieval techniques being applied in the healthcare domain and covers the following areas:
Semantic data integration in e-health care systems
Keyword-based medical information retrieval
Ontology-based query retrieval support for e-health implementation
Ontologies as a database management system technology for medical information retrieval
Information integration using contextual knowledge and ontology merging
Collaborative ontology-based information indexing and retrieval in health informatics
An ontology-based text mining framework for vulnerability assessment in health and social care
An ontology-based multi-agent system for matchmaking patient healthcare monitoring
A multi-agent system for querying heterogeneous data sources with ontologies to reduce the cost of customized healthcare systems
A methodology for ontology-based multi-agent systems development
Ontology-based systems for clinical systems: validity, ethics and regulation
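As a small, hypothetical illustration of one ingredient that several of these areas build on, the sketch below expands a clinical query with subclasses and synonyms taken from a toy is-a hierarchy before matching documents, so that a record mentioning only an abbreviation is still retrieved. The mini-ontology, synonym table, and documents are invented and do not come from the book.

```python
# Illustrative ontology-based query expansion for retrieval. The mini-ontology,
# synonym table, and documents are invented examples.

# Tiny is-a hierarchy: concept -> direct subclasses (hypothetical).
subclasses = {
    "diabetes": ["type 1 diabetes", "type 2 diabetes", "gestational diabetes"],
    "type 2 diabetes": ["insulin-resistant diabetes"],
}

# Synonyms and abbreviations attached to concepts (hypothetical).
synonyms = {
    "type 2 diabetes": ["t2dm", "adult-onset diabetes"],
}

def expand(term):
    """Expand a query term with all of its subclasses and their synonyms."""
    terms, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t in terms:
            continue
        terms.add(t)
        terms.update(synonyms.get(t, []))
        stack.extend(subclasses.get(t, []))
    return terms

documents = {
    1: "patient with t2dm on metformin",
    2: "gestational diabetes screening in week 24",
    3: "fracture of the left radius",
}

def search(term):
    query_terms = expand(term)
    return [doc_id for doc_id, text in documents.items()
            if any(q in text for q in query_terms)]

print(search("diabetes"))  # [1, 2] -- plain matching on "diabetes" would miss document 1
```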
An Introduction to Ontology Engineering
Author: C. Maria Keet
Publisher:
ISBN: 9781848902954
Category: Computer software
Languages: en
Pages : 344
Book Description
An Introduction to Ontology Engineering introduces the student to a comprehensive overview of ontology engineering and offers hands-on experience that illustrates the theory. The topics covered include: logic foundations for ontologies with languages and automated reasoning; developing good ontologies with methods and methodologies; the top-down approach with foundational ontologies; the bottom-up approach to extract content from legacy material; and a selection of advanced topics that includes Ontology-Based Data Access, the interaction between ontologies and natural languages, and advanced modelling with fuzzy and temporal ontologies. Each chapter contains review questions and exercises, and descriptions of two group assignments are provided as well. The textbook is aimed at advanced undergraduate/postgraduate level in computer science and could fit a semester course in ontology engineering or a 2-week intensive course. Domain experts and philosophers may find a subset of the chapters of interest, or work through the chapters in a different order. Maria Keet is an Associate Professor with the Department of Computer Science, University of Cape Town, South Africa. She received her PhD in Computer Science in 2008 at the KRDB Research Centre, Free University of Bozen-Bolzano, Italy. Her research focus is on knowledge engineering with ontologies and Ontology, and their interaction with natural language and conceptual data modelling, which has resulted in over 100 peer-reviewed publications. She has developed and taught multiple courses on ontology engineering and related courses at various universities since 2009.
Studying Simulations with Distributed Cognition
Author: Jonas Rybing
Publisher: Linköping University Electronic Press
ISBN: 9176853489
Category:
Languages: en
Pages : 115
Book Description
Simulations are frequently used techniques for training, performance assessment, and prediction of future outcomes. In this thesis, the term “human-centered simulation” is used to refer to any simulation in which humans and human cognition are integral to the simulation’s function and purpose (e.g., simulation-based training). A general problem for human-centered simulations is to capture the cognitive processes and activities of the target situation (i.e., the real-world task) and recreate them accurately in the simulation. The prevalent view within the simulation research community is that cognition consists of internal, decontextualized computational processes of individuals. However, contemporary theories of cognition emphasize the importance of the external environment, the use of tools, as well as social and cultural factors in cognitive practice. Consequently, there is a need for research on how such contemporary perspectives can be used to describe human-centered simulations, re-interpret theoretical constructs of such simulations, and direct how simulations should be modeled, designed, and evaluated.
This thesis adopts distributed cognition as a framework for studying human-centered simulations. Training and assessment of emergency medical management in a Swedish context using the Emergo Train System (ETS) simulator was adopted as a case study. ETS simulations were studied and analyzed using the distributed cognition for teamwork (DiCoT) methodology with the goal of understanding, evaluating, and testing the validity of the ETS simulator. Moreover, to explore distributed cognition as a basis for simulator design, a digital re-design of ETS (DIGEMERGO) was developed based on the DiCoT analysis. The aim of the DIGEMERGO system was to retain core distributed cognitive features of ETS, to increase validity and outcome reliability, and to provide a digital platform for emergency medical studies. DIGEMERGO was evaluated in three separate studies: first, a usefulness, usability, and face-validation study that involved subject-matter experts; second, a comparative validation study using an expert-novice group comparison; and finally, a transfer-of-training study based on self-efficacy and management performance. Overall, the results showed that DIGEMERGO was perceived as a useful, immersive, and promising simulator – with mixed evidence for validity – that demonstrated increased general self-efficacy and management performance following simulation exercises.
This thesis demonstrates that distributed cognition, using DiCoT, is a useful framework for understanding, designing and evaluating simulated environments. In addition, the thesis conceptualizes and re-interprets central constructs of human-centered simulation in terms of distributed cognition. In doing so, the thesis shows how distributed cognitive processes relate to validity, fidelity, functionality, and usefulness of human-centered simulations. This thesis thus provides a new understanding of human-centered simulations that is grounded in distributed cognition theory.
Ontology in Information Science
Author: Ciza Thomas
Publisher: BoD – Books on Demand
ISBN: 9535138871
Category: Computers
Languages: en
Pages : 312
Book Description
The book Ontology in Information Science explores a broad set of ideas and concisely presents some of the state-of-the-art research in this field in 12 chapters. It provides researchers and practitioners working in the field of ontology and information science with an opportunity to share their theories, methodologies, experiences, and experimental results related to ontology development and application in various areas. It also includes the design aspects of domain ontologies, considering the architecture, development strategy, and selection of tools. The intended audience of this book will mainly consist of researchers, research students, and practitioners in the field of ontology and information science.
Brain-Inspired Cognitive Architectures for Artificial Intelligence: BICA*AI 2020
Author: Alexei V. Samsonovich
Publisher: Springer Nature
ISBN: 3030655962
Category: Technology & Engineering
Languages: en
Pages : 613
Book Description
The book focuses on original approaches intended to support the development of biologically inspired cognitive architectures. It brings together different disciplines, from classical artificial intelligence to linguistics, from neuro- and social sciences to design and creativity, among others. The chapters, based on contributions presented at the Eleventh Annual Meeting of the BICA Society, held on November 10-14, 2020, in Natal, Brazil, discuss emerging methods, theories and ideas towards realizing general-purpose humanlike artificial intelligence or fostering a better understanding of the ways the human mind works. All in all, the book provides engineers, mathematicians, psychologists, computer scientists and other experts with a timely snapshot of recent research and a source of inspiration for future developments in the broadly intended areas of artificial intelligence and biological inspiration.
System-Level Design of GPU-Based Embedded Systems
Author: Arian Maghazeh
Publisher: Linköping University Electronic Press
ISBN: 9176851753
Category:
Languages: en
Pages : 81
Book Description
Modern embedded systems deploy several hardware accelerators, in a heterogeneous manner, to deliver high-performance computing. Among such devices, graphics processing units (GPUs) have earned a prominent position by virtue of their immense computing power. However, a system design that relies on the sheer throughput of GPUs is often incapable of satisfying the strict power- and time-related constraints faced by embedded systems.
This thesis presents several system-level software techniques to optimize the design of GPU-based embedded systems under various graphics and non-graphics applications. Compared to conventional application-level optimizations, the system-wide view of our proposed techniques brings several advantages: first, it allows for fully incorporating the limitations and requirements of the various system parts in the design process; second, it can unveil optimization opportunities by exposing the information flow between the processing components; third, the techniques are generally applicable to a wide range of applications with similar characteristics. In addition, multiple system-level techniques can be combined together or with application-level techniques to further improve performance.
We begin by studying some of the unique attributes of GPU-based embedded systems and discussing several factors that distinguish the design of these systems from that of conventional high-end GPU-based systems. We then proceed to develop two techniques that address an important challenge in the design of GPU-based embedded systems from different perspectives. The challenge arises from the fact that GPUs require a large amount of workload to be present at runtime in order to deliver a high throughput. However, for some embedded applications, collecting large batches of input data requires an unacceptable waiting time, prompting a trade-off between throughput and latency. We also develop an optimization technique for GPU-based applications to address the memory bottleneck issue by utilizing the GPU L2 cache to shorten data access time. Moreover, in the area of graphics applications, and in particular with a focus on mobile games, we propose a power management scheme to reduce GPU power consumption by dynamically adjusting the display resolution, while considering the user's visual perception at various resolutions. We also discuss the collective impact of the proposed techniques in tackling the design challenges of emerging complex systems.
The proposed techniques are assessed by real-life experiments on GPU-based hardware platforms, which demonstrate the superior performance of our approaches compared to state-of-the-art techniques.
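The throughput/latency tension described above can be made concrete with a toy batching model; the cost parameters below are invented assumptions, and the sketch is only meant to show why larger GPU batches raise throughput while inflating the waiting time of individual work items, not to reproduce the thesis's techniques.

```python
# Toy model of the throughput/latency trade-off created by batching work for a
# GPU. All numbers are invented assumptions for illustration.

ARRIVAL_INTERVAL_MS = 2.0   # one work item arrives every 2 ms (assumed)
KERNEL_OVERHEAD_MS = 5.0    # fixed launch/transfer overhead per batch (assumed)
PER_ITEM_MS_SMALL = 1.0     # per-item kernel cost at low occupancy (assumed)
PER_ITEM_MS_LARGE = 0.1     # per-item kernel cost once the GPU is saturated (assumed)
SATURATION_BATCH = 64       # batch size at which the GPU is fully utilized (assumed)

def kernel_time_ms(batch_size: int) -> float:
    """Per-batch kernel time: fixed overhead plus a per-item cost that shrinks
    as the batch size approaches the GPU's saturation point."""
    frac = min(batch_size / SATURATION_BATCH, 1.0)
    per_item = PER_ITEM_MS_SMALL + (PER_ITEM_MS_LARGE - PER_ITEM_MS_SMALL) * frac
    return KERNEL_OVERHEAD_MS + per_item * batch_size

print(f"{'batch':>6} {'throughput (items/s)':>22} {'worst-case latency (ms)':>25}")
for batch in (1, 8, 32, 64, 128):
    t_kernel = kernel_time_ms(batch)
    throughput = 1000.0 * batch / t_kernel
    # The first item in a batch waits for the rest of the batch to arrive,
    # then for the whole kernel to finish.
    worst_latency = (batch - 1) * ARRIVAL_INTERVAL_MS + t_kernel
    print(f"{batch:>6} {throughput:>22.0f} {worst_latency:>25.1f}")
```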
Big Data Computing
Author: Rajendra Akerkar
Publisher: CRC Press
ISBN: 1466578386
Category: Business & Economics
Languages: en
Pages : 562
Book Description
Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix