<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="/resources/xsl/jats-html.xsl"?>
<article xml:lang="en" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
  <front>
    <journal-meta>
      <journal-id journal-id-type="publisher-id">kbslis</journal-id>
      <journal-title-group>
        <journal-title>Journal of the Korean Biblia Society for Library and Information Science</journal-title>
      </journal-title-group>
      <issn pub-type="ppub">1229-2435</issn>
      <publisher>
        <publisher-name>Korean Biblia Society for Library and Information Science</publisher-name>
      </publisher>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="publisher-id">BBROBV_2014_v25n1_5</article-id>
      <article-id pub-id-type="doi">10.14699/kbiblia.2014.25.1.005</article-id>
      <article-categories>
        <subj-group>
          <subject>Articles</subject>
        </subj-group>
      </article-categories>
      <title-group>
        <article-title>Digital Library Evaluation Criteria: What do Users Want?<xref ref-type="author-notes" rid="fn1">*</xref></article-title>
        <trans-title-group xml:lang="ko">
          <trans-title>디지털 도서관 평가기준: 이용자들이 원하는 것은 무엇인가?</trans-title>
        </trans-title-group>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author" corresp="yes">
          <name name-style="western" xml:lang="en">
            <surname>Xie</surname>
            <given-names>Iris</given-names>
          </name>
          <xref ref-type="aff" rid="aff1">**</xref>
        </contrib>
        <contrib contrib-type="author">
          <name name-style="western" xml:lang="en">
            <surname>Joo</surname>
            <given-names>Soohyung</given-names>
          </name>
          <xref ref-type="aff" rid="aff2">***</xref>
        </contrib>
        <contrib contrib-type="author">
          <name name-style="western" xml:lang="en">
            <surname>Matusiak</surname>
            <given-names>Krystyna K.</given-names>
          </name>
          <xref ref-type="aff" rid="aff3">****</xref>
        </contrib>
      </contrib-group>
      <aff id="aff1">
        <label>**</label>School of Information Studies, University of Wisconsin-Milwaukee (<email>hiris@uwm.edu</email>) (first and corresponding author)
      </aff>
      <aff id="aff2">
        <label>***</label>School of Information Studies, University of Wisconsin-Milwaukee (<email>sjoo@uwm.edu</email>)
      </aff>
      <aff id="aff3">
        <label>****</label>Morgridge College of Education, University of Denver (<email>Krystyna.Matusiak@du.edu</email>)
      </aff>
	  <author-notes>
	   <fn id="fn1">
        <label>*</label><p>This study was presented at the joint conference of Korean Biblia Society for Library and Information Science and the University of Wisconsin-Milwaukee in 2013.</p>
      </fn>
	  </author-notes>
      <pub-date pub-type="ppub">
        <day>30</day>
        <month>03</month>
        <year>2014</year>
      </pub-date>
      <volume>25</volume>
      <issue>1</issue>
      <fpage>5</fpage>
      <lpage>18</lpage>
      <history>
        <date date-type="received">
          <day>17</day>
          <month>02</month>
          <year>2014</year>
        </date>
        <date date-type="accepted">
          <day>13</day>
          <month>03</month>
          <year>2014</year>
        </date>
      </history>
      <permissions>
        <copyright-statement>Copyright &#x000a9; 2014, Korean Biblia Society for Library and Information Science</copyright-statement>
        <copyright-year>2014</copyright-year>
      </permissions>
      <abstract>
        <p>Criteria for evaluating digital libraries have been suggested in prior studies, but limited research has been done to understand users&#x2019; perceptions of evaluation criteria. This study investigates users&#x2019; opinions of the importance of digital library evaluation criteria. Thirty user participants, including 10 faculty members and 20 students, were recruited from five universities across the United States. They were asked to rate the importance of evaluation criteria in eight dimensions (e.g. collections, information organization, context, etc.). The results demonstrate that users care about evaluation criteria related to the use and quality of collection and services rather than the operation of digital libraries. The findings of this study are relevant to the development of user-centered digital libraries and associated evaluation frameworks through the incorporation of unique users&#x2019; needs and preferences.</p>
      </abstract>
      <trans-abstract xml:lang="ko">
        <p>기존의 여러 연구들에서 디지털 도서관 평가를 위한 기준들이 제시되어 왔으나, 대부분의 연구에서 평가기준에 대한 이용자 관점을 이해하려는 노력이 부족하였다. 본 연구는 디지털 도서관 평가 기준들에 대한 이용자들의 의견을 미국 내 5개 대학교에서 10명의 교수 이용자와 20명의 학생 이용자에게 설문을 통해 직접 조사하였다. 설문 참여자들은 본 연구진에 의해 제시된 8개 영역 (장서, 정보조직, 맥락 등) 내 평가 기준들의 중요성에 대해 7점 척도로 응답하였다. 설문의 결과는 이용자들이 장서의 이용과 품질, 서비스와 관련된 항목들을 도서관 운영항목에 비해 상대적으로 더 중요하게 생각하고 있었음을 보여주었다. 본 연구의 결과는 이용자 중심의 디지털 도서관 개발과 관련하여 이용자의 요구와 선호를 반영하는 도서관 평가체계 구축에 도움이 될 것이다.</p>
      </trans-abstract>
      <kwd-group kwd-group-type="author" xml:lang="en">
        <kwd>digital libraries</kwd>
        <kwd>digital library evaluation</kwd>
        <kwd>evaluation criteria</kwd>
        <kwd>user perspectives</kwd>
        <kwd>evaluation dimensions</kwd>
      </kwd-group>
      <kwd-group kwd-group-type="author" xml:lang="ko">
        <kwd>디지털 도서관</kwd>
        <kwd>디지털 도서관 평가</kwd>
        <kwd>평가 기준</kwd>
        <kwd>이용자 관점</kwd>
        <kwd>평가 영역</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec sec-type="intro">
      <title>1. Introduction and Literature Review</title>
      <p>Digital libraries (DLs) have emerged as one of the essential scholarly information systems in support of research and teaching. Many libraries have digitized their collections, such as pictures, books, or audio recordings, to make them available on the web. Digital libraries are a relatively new phenomenon and, like many new and emergent information systems, face challenges of acceptance, utilization, and evaluation. The concept of a digital library has been perceived in different ways by different groups of people. To the community of librarians and LIS researchers, a digital library is an extension and augmentation of library services combined with remote access to digitized resources. To computer scientists, a digital library is a distributed information system or networked multimedia information system (<xref ref-type="bibr" rid="r004">Fox et al. 1995</xref>). In this study, our focus is on the users&#x2019; perspective. A digital library is defined as a collection of digitized or born-digital items that are stored, managed, serviced, and preserved by digital library professionals.</p>
      <p>The exponential growth of DLs has created a need for the evaluation of these emergent information systems. Digital library evaluation has become a critical issue for both professionals and researchers. DLs have their own unique characteristics and features compared to traditional library services. Also, as information retrieval systems, digital libraries are fairly different from other types of online information retrieval systems (e.g., search engines, online databases, OPACs). DLs offer well-organized metadata and browsing categories, as well as digitized items in different formats. Therefore, previous evaluation frameworks used for traditional libraries or other information retrieval systems have proved insufficient for assessing the different aspects of DLs. The evaluation of DLs is conceptually complex and pragmatically challenging (<xref ref-type="bibr" rid="r015">Saracevic and Covi 2000</xref>). <xref ref-type="bibr" rid="r001">Borgman et al. (2000)</xref> also pointed out that technical complexity, the variety of content, and the lack of evaluation methods posed key challenges for digital library evaluation.</p>
      <p>Researchers have exerted efforts to develop new frameworks and methods of digital library evaluation. A number of evaluation criteria have been suggested, covering different dimensions of digital libraries. The early DL projects, funded by the National Science Foundation (NSF) as part of Digital Libraries Initiatives I and II, laid the groundwork for evaluation research by producing DL prototypes and frameworks (<xref ref-type="bibr" rid="r001">Borgman et al. 2000</xref>; <xref ref-type="bibr" rid="r002">Buttenfield 1999</xref>; <xref ref-type="bibr" rid="r008">Hill et al. 2000</xref>; <xref ref-type="bibr" rid="r018">Van House et al. 1996</xref>). In particular, <xref ref-type="bibr" rid="r017">Hill et al. (1997)</xref> identified several criteria for digital library evaluation, such as ease of use, overall appeal, usefulness, and overall performance. <xref ref-type="bibr" rid="r016">Saracevic (2004)</xref> identified six classes of criteria: content, technology, interface, process/service, user, and context. This evaluation framework covers multiple aspects of digital libraries comprehensively and was one of the first attempts to assess the context aspect of digital libraries. <xref ref-type="bibr" rid="r019">Xie&#x2019;s (2006</xref>/<xref ref-type="bibr" rid="r020">2008)</xref> evaluation framework shifted the focus to users and posited five types of criteria: usability, collection quality, service quality, system performance efficiency, and user feedback solicitation. In particular, when developing the evaluation framework, she analyzed users&#x2019; perceptions based on diaries, questionnaires, and interviews. <xref ref-type="bibr" rid="r022">Zhang (2010)</xref> validated evaluation criteria for digital libraries. Based on <xref ref-type="bibr" rid="r016">Saracevic's (2004)</xref> framework, she investigated the importance of evaluation criteria using empirical survey data from heterogeneous stakeholders. <xref ref-type="bibr" rid="r014">Noh (2010)</xref> identified multiple dimensions and corresponding evaluation indices for electronic resources.</p>
      <p>In Europe, the DELOS Network of Excellence has conducted a series of projects on the evaluation of digital libraries. DELOS is a comprehensive, large-scale DL project representing joint activities aimed at integrating and coordinating the ongoing research efforts of the major European teams working in the digital library area. <xref ref-type="bibr" rid="r003">Candela et al. (2007)</xref> established the DELOS Manifesto, which presents a three-tier DL framework incorporating six core components: content, functionality, quality, policy, architecture, and user. <xref ref-type="bibr" rid="r005">Fuhr et al. (2001)</xref> proposed an evaluation scheme for digital libraries covering four dimensions: data/collection, system/technology, users, and usage. Based on an examination of the interactions among digital library components, <xref ref-type="bibr" rid="r017">Tsakonas et al. (2004)</xref> proposed major evaluation foci, such as usability, usefulness, and system performance. <xref ref-type="bibr" rid="r006">Fuhr et al. (2007)</xref> developed a DL evaluation framework based on the DELOS model and conducted a large-scale survey of DL evaluation activities.</p>
      <p>Among the different aspects of digital libraries, usability has been one of the major concerns in evaluation. Usability consists of multiple attributes viewed from various perspectives, such as learnability, efficiency, effectiveness, memorability, errors, and satisfaction (<xref ref-type="bibr" rid="r013">Nielsen 1993</xref>). Several researchers have proposed usability evaluation models tailored to digital libraries. For example, <xref ref-type="bibr" rid="r009">Jeng (2005)</xref> suggested an evaluation framework for the usability of academic DLs focusing on four attributes: effectiveness, efficiency, satisfaction, and learnability. <xref ref-type="bibr" rid="r021">Ward and Hiller (2005)</xref> suggested usability evaluation criteria specific to library services, such as completion of the task, time and effort, and reaction to the product or service. <xref ref-type="bibr" rid="r010">Joo and Lee (2011)</xref> developed an instrument to measure the usability of digital libraries and further tested its validity and reliability. <xref ref-type="bibr" rid="r012">Matusiak (2012)</xref> examined the relationship between usability and usefulness, and found that user perceptions of usefulness and usability, especially perceived ease of use, play an important role in user intentions to adopt and use digital collections.</p>
      <p>Researchers have strived to identify a set of evaluation criteria for digital libraries. However, relatively little effort has been devoted to investigating users&#x2019; perceptions when selecting evaluation criteria. Ultimately, digital libraries are developed to provide information and services to users, and users&#x2019; opinions should be considered in their evaluation. It is important that all efforts in the evaluation of digital libraries be rooted in users&#x2019; information needs and characteristics, as well as in the contexts involving the users of those libraries (<xref ref-type="bibr" rid="r011">Marchionini et al. 1998</xref>).</p>
      <p>This study is one of the few attempts to survey users&#x2019; perceptions of evaluation criteria for digital libraries. The investigation of user perceptions is a fundamental step in devising an evaluation framework that focuses on user needs and characteristics. In this study, the authors suggested a wide range of evaluation criteria in eight dimensions of digital libraries based on a document analysis. For the suggested evaluation criteria, this study examines how important users perceive each criterion to be in the evaluation of digital libraries.</p>
    </sec>
    <sec id="s2" sec-type="methods">
      <title>2. Research Problem and Research Question</title>
      <p>Thus far, digital library evaluation criteria have been suggested mainly by librarians or researchers. To design a user-centered digital library, the evaluation needs to reflect users&#x2019; perspectives in its criteria. This study is one of the few that investigate users&#x2019; perceptions of evaluation criteria for digital libraries.</p>
      <p>This study intends to examine the following research question:</p>
      <p>What are users&#x2019; perceptions of the importance of digital library evaluation criteria?</p>
      <sec id="s2a">
        <title>2.1 Methodology</title>
        <p>Two rounds of surveys were conducted to identify the importance of evaluation criteria and the appropriateness of measures from different stakeholders of digital libraries, including scholars, digital librarians, and users. This paper focuses on the identification of the importance of evaluation criteria from users&#x2019; perspectives.</p>
        <p>The authors partnered with five academic libraries across the United States to collect data. Subjects of this study were recruited from these partner libraries: (1) University of Denver, (2) University of Florida, (3) University of Nevada Las Vegas, (4) Drake University, and (5) University of Wisconsin-Milwaukee. Each institution recruited six digital library users to participate in the study. The study employed a purposeful sampling strategy. The sample included academic users with prior experience interacting with digital collections. To ensure maximum variation sampling, participants were recruited from different groups of academic users, with different genders and majors, such as Linguistics, English, Psychology, and Computer Science. The user subjects included 10 faculty members, 12 graduate students, and 8 undergraduate students. A $30 gift card was given to each subject as an incentive for participation in the study. <xref ref-type="table" rid="t001">Table 1</xref> presents the demographic data of the subjects.</p>
        <table-wrap id="t001" position="float">
          <label>&#x3C;Table 1&#x3E;</label>
          <caption>
            <title>Demographic data of subjects</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t001.jpg" position="float"/>
        </table-wrap>
        <p>A comprehensive survey was administered to investigate users&#x2019; perceptions of the importance of evaluation criteria in digital library evaluation. To suggest an initial set of evaluation criteria, a comprehensive document analysis was conducted. Using different combinations of the keywords &#x201C;digital library&#x201D;, &#x201C;evaluation&#x201D;, &#x201C;criteria&#x201D;, and other terms, relevant research papers were collected through Google Scholar and EBSCOhost databases. In addition, five websites related to digital libraries, such as DigiQUAL and DELOS, were also analyzed to identify evaluation criteria.</p>
        <p>Digital library evaluation criteria were extracted from the retrieved pool of documents. Based on the document analysis, ten essential dimensions of digital libraries were identified: collection, information organization, interface design, system and technology, effects on users, services, preservation, administration, user engagement, and context. The administration and preservation dimensions were excluded from the user survey because users generally lack sufficient knowledge of these areas. For each dimension, the authors proposed a set of evaluation criteria. To help subjects understand the meaning of the evaluation criteria, the definition associated with each criterion was provided in the survey. Subjects were instructed to rate the importance of the evaluation criteria on a seven-point scale. Since the subjects were recruited from different locations and had used different digital libraries, they were not asked to evaluate a specific digital library; rather, they based their survey responses on their past interactions with digital library systems.</p>
        <p>Descriptive statistics, including the mean and standard deviation, were used to investigate the importance of the evaluation criteria. Based on the average ratings, the authors ranked the evaluation criteria from most to least important within each dimension.</p>
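        <p>The analysis step above can be sketched in code. The ratings below are hypothetical placeholders, not the study&#x2019;s data; the sketch only illustrates computing means and standard deviations and ranking criteria with tied averages, as in the tables that follow.</p>

```python
from statistics import mean, stdev

# Hypothetical 7-point importance ratings for three criteria in one
# dimension; the study's per-subject responses are not reproduced here.
ratings = {
    "authority": [7, 6, 7, 6, 7],
    "item quality": [6, 6, 7, 6, 6],
    "size": [5, 6, 5, 6, 5],
}

# Descriptive statistics per criterion: mean and sample standard deviation.
stats = {c: (mean(r), stdev(r)) for c, r in ratings.items()}

# Rank criteria from most to least important by mean rating, giving
# tied means the same rank, as in the paper's tables.
ordered = sorted(stats, key=lambda c: stats[c][0], reverse=True)
ranks = {}
for i, c in enumerate(ordered):
    prev = ordered[i - 1] if i else None
    ranks[c] = ranks[prev] if prev and stats[c][0] == stats[prev][0] else i + 1
```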
      </sec>
      <sec id="s2b" sec-type="results">
        <title>2.2 Results</title>
        <p>The results section is organized by the eight dimensions of digital libraries: collection, information organization, interface design, system and technology, effects on users, services, user engagement, and context. In the dimension of collections, quality-related evaluation criteria were the ones users considered the most important. &#x201C;Authority (6.53)&#x201D;, &#x201C;item quality (6.27)&#x201D;, and &#x201C;digitization standards (6.20)&#x201D; turned out to be the top three evaluation criteria. Following these, &#x201C;cost (6.10)&#x201D;, &#x201C;format compatibility (6.10)&#x201D;, and &#x201C;contextual information (6.10)&#x201D; were tied for fourth. In contrast, &#x201C;size (5.57)&#x201D;, &#x201C;diversity (5.77)&#x201D;, and &#x201C;completeness (5.77)&#x201D; were considered the least important. It seems that users cared more about the quality of the collections and less about their comprehensiveness and variety. <xref ref-type="table" rid="t002">Table 2</xref> presents the importance of evaluation criteria in the dimension of collections.</p>
        <table-wrap id="t002" position="float">
          <label>&#x3C;Table 2&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of collections</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t002.jpg" position="float"/>
        </table-wrap>
        <p>For the dimension of information organization, users perceived metadata as key. In particular, the accuracy and consistency of metadata were the most important criteria in assessing the organization of digital libraries. &#x201C;Metadata accuracy (6.28)&#x201D;, &#x201C;consistency (6.24)&#x201D;, and &#x201C;depth of metadata (6.21)&#x201D; were ranked 1st, 2nd, and 3rd respectively. &#x201C;Comprehensiveness (6.10)&#x201D;, &#x201C;accessibility to metadata (6.07)&#x201D;, and &#x201C;appropriateness (6.03)&#x201D; also received high scores. On the other hand, users did not rate highly the evaluation criteria that professionals care about in developing digital libraries: &#x201C;metadata interoperability (5.48)&#x201D;, &#x201C;controlled vocabulary (5.69)&#x201D;, and &#x201C;metadata standards (5.86)&#x201D; were perceived as the least important. <xref ref-type="table" rid="t003">Table 3</xref> presents the importance of evaluation criteria in the dimension of information organization.</p>
        <table-wrap id="t003" position="float">
          <label>&#x3C;Table 3&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of information organization</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t003.jpg" position="float"/>
        </table-wrap>
        <p>In terms of interface design, the users regarded &#x201C;browsing function (6.53)&#x201D; and &#x201C;search function (6.48)&#x201D; as the most important criteria in evaluating digital libraries. Searching and browsing are the two main approaches in the information retrieval process, and browsing is a distinctive feature of digital libraries because of the nature of digital collections. &#x201C;Navigation (6.36)&#x201D; and &#x201C;reliability (6.28)&#x201D; were also chosen as important evaluation criteria by the user group. However, &#x201C;personalized page (4.17)&#x201D;, &#x201C;user control (5.14)&#x201D;, and &#x201C;visual appeal (5.59)&#x201D; were rated the least important in this dimension; in this study, customized features were not deemed important as assessment criteria. <xref ref-type="table" rid="t004">Table 4</xref> presents the importance of evaluation criteria in the dimension of interface design.</p>
        <table-wrap id="t004" position="float">
          <label>&#x3C;Table 4&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of interface design</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t004.jpg" position="float"/>
        </table-wrap>
        <p>As to the dimension of system and technology, the effectiveness and reliability of digital libraries were the key evaluation criteria for users. &#x201C;Response time (6.26)&#x201D;, &#x201C;retrieval effectiveness (6.25)&#x201D;, and &#x201C;reliability (6.25)&#x201D; turned out to be the most important criteria from the user perspective in DL evaluation. As DLs are considered a type of information retrieval system, the subjects thought that response time and retrieval effectiveness (e.g., precision, recall, etc.) would be important in evaluating the performance of digital library systems. Reliability is a criterion needed to provide stable services to users of digital libraries. &#x201C;Server performance (5.93)&#x201D;, &#x201C;fit-to-task (5.93)&#x201D;, and &#x201C;error rate/error correction (5.93)&#x201D; were tied for fifth. The less important criteria were &#x201C;linkage with other digital libraries (5.36)&#x201D;, &#x201C;integrated search (5.75)&#x201D;, and &#x201C;flexibility (5.82)&#x201D;. Comparatively speaking, users cared less about the ability to integrate different collections within the digital library environment. <xref ref-type="table" rid="t005">Table 5</xref> presents the importance of evaluation criteria in the dimension of system and technology.</p>
        <table-wrap id="t005" position="float">
          <label>&#x3C;Table 5&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of system and technology</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t005.jpg" position="float"/>
        </table-wrap>
        <p>In users&#x2019; ratings of the criteria in the dimension of effects on users, research output and learning effects were essential because these relate to users&#x2019; goals in the academic world. They perceived research and learning as the most important aspects of effects on users to be assessed in digital libraries. &#x201C;Research productivity (5.89)&#x201D; and &#x201C;learning effects (5.46)&#x201D; were chosen as the two most important criteria. Following these, &#x201C;instructional efficiency (5.32)&#x201D; and &#x201C;knowledge change (5.26)&#x201D; were ranked third and fourth respectively. In contrast, evaluation criteria that general users care about were comparatively less important to this group: &#x201C;information literacy/skill change (5.00)&#x201D; and &#x201C;perceptions of digital libraries (5.11)&#x201D; were regarded as relatively less important. <xref ref-type="table" rid="t006">Table 6</xref> presents the importance of evaluation criteria in the dimension of effects on users.</p>
        <table-wrap id="t006" position="float">
          <label>&#x3C;Table 6&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of effects on users</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t006.jpg" position="float"/>
        </table-wrap>
        <p>In the dimension of services, the subjects again considered reliability and quality of service important. &#x201C;Services for users with disabilities (6.43)&#x201D;, &#x201C;reliability (6.39)&#x201D;, and &#x201C;service quality (6.36)&#x201D; were selected as the three most important criteria. Interestingly, the subjects thought that the evaluation should reflect the types of services tailored to users with disabilities (e.g., blind users, visually impaired users, etc.). The criteria ranked 2nd, 3rd, and 4th, &#x201C;reliability&#x201D;, &#x201C;service quality&#x201D;, and &#x201C;user satisfaction&#x201D;, are commonly used evaluation criteria for services in other types of information systems. &#x201C;Usefulness (6.29)&#x201D;, &#x201C;responsiveness (6.21)&#x201D;, and &#x201C;timeliness (6.11)&#x201D; were ranked 5th, 6th, and 7th respectively, all with rating scores above 6. On the other hand, &#x201C;customized services (4.89)&#x201D;, &#x201C;types of unique services (5.14)&#x201D;, and &#x201C;user education (5.36)&#x201D; were ranked as the least important criteria. It is noteworthy that users considered &#x201C;customized services&#x201D; comparatively less important; apparently, users expected less in regard to special services. <xref ref-type="table" rid="t007">Table 7</xref> presents the importance of evaluation criteria in the dimension of services.</p>
        <table-wrap id="t007" position="float">
          <label>&#x3C;Table 7&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of services</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t007.jpg" position="float"/>
        </table-wrap>
        <p>In the dimension of user engagement, &#x201C;user feedback (5.89)&#x201D;, &#x201C;resource use (5.81)&#x201D;, and &#x201C;help feature use (5.79)&#x201D; were the three most highly rated evaluation criteria among the user group. &#x201C;User feedback&#x201D; is one of the explicit and direct communication channels between users and digital libraries. &#x201C;Resource use&#x201D; is one of the fundamental criteria in library assessment, and it is also perceived as an important evaluation criterion in the context of digital libraries. Since digital libraries represent a relatively new type of IR system, users still need help features in order to access digital libraries effectively. On the other hand, &#x201C;e-commerce support (4.89)&#x201D; and &#x201C;user knowledge contribution (5.00)&#x201D; were perceived as less important in evaluation. <xref ref-type="table" rid="t008">Table 8</xref> presents the importance of evaluation criteria in the dimension of user engagement.</p>
        <table-wrap id="t008" position="float">
          <label>&#x3C;Table 8&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of user engagement</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t008.jpg" position="float"/>
        </table-wrap>
        <p>Finally, the subjects selected &#x201C;information ethics compliance (6.61)&#x201D;, &#x201C;copyright (6.25)&#x201D;, and &#x201C;content sharing (6.00)&#x201D; as the most important evaluation criteria in the dimension of context. In particular, the subjects gave a comparatively higher score to &#x201C;information ethics compliance&#x201D;, presumably because they are academic users rather than general users. Following the top criteria, &#x201C;targeted user community (5.75)&#x201D; and &#x201C;collaboration (5.75)&#x201D; were tied for fourth. Comparatively speaking, &#x201C;social impact (5.18)&#x201D; and &#x201C;organizational mission (5.43)&#x201D; were considered less important in relation to context evaluation. This group of users cared more about rules and policies than about the impact of digital libraries on society and organizations. <xref ref-type="table" rid="t009">Table 9</xref> presents the importance of evaluation criteria in the dimension of context.</p>
        <table-wrap id="t009" position="float">
          <label>&#x3C;Table 9&#x3E;</label>
          <caption>
            <title>Importance of evaluation criteria in the dimension of context</title>
          </caption>
          <graphic xlink:href="../ingestImageView?artiId=ART001860306&amp;imageName=BBROBV_2014_v25n1_5_t009.jpg" position="float"/>
        </table-wrap>
      </sec>
    </sec>
    <sec id="s3" sec-type="discussion|conclusions">
      <title>3. Discussion and Conclusion</title>
      <p>Identifying evaluation criteria is essential for the successful evaluation of digital libraries. Previous research suggested a variety of evaluation criteria for different dimensions of digital libraries. However, users&#x2019; perspectives have not been sufficiently investigated in digital library evaluation frameworks. The present study investigated users&#x2019; opinions on the importance of evaluation criteria for digital library evaluation. The unique contribution of this study lies in its comprehensive examination of users&#x2019; perceptions of evaluation criteria across different dimensions of digital libraries. The ratings of evaluation criteria showed the most and least important criteria from the users&#x2019; perspective. Practically, the findings of this study assist library professionals in making decisions about selecting appropriate evaluation criteria for different evaluation objectives.</p>
      <p>Different stakeholders identify their DL evaluation criteria based on their needs, backgrounds, interests, and familiarity with DL concepts. Users rank higher the criteria related directly to the use of collections, such as authority in collections, metadata accuracy in information organization, or the quality of services. Their rankings reflect the expectations of digital library users. Users think less about the cost and effort required to build DLs, which are of concern to digital librarians. The top criteria selected by users indicate what they care about most in digital library use. For example, in information organization, they rated accuracy, consistency, and depth of metadata as the top criteria. Accuracy is, of course, essential for users to actually use the metadata for their research and learning, and depth of metadata helps them obtain as much information as possible about each digital item. In developing digital libraries, digital librarians need to consider users&#x2019; needs in regard to the quality of collections, metadata, and services. Interface design needs to offer multiple options for users to access documents. At the same time, interface design needs to consider the special requirements of users with a variety of disabilities. Reliable and effective system performance is a key requirement for users to access digital libraries.</p>
      <p>However, not all users are the same. The subjects of this study represent the academic user group of digital libraries. In addition to sharing general users&#x2019; perceptions of digital library evaluation criteria, they have unique needs and opinions arising from their academic backgrounds. In the dimension of effects on users, they ranked research productivity and learning effects as their top choices; research and learning are the two academic goals of this group of users. The design of digital libraries in academic settings therefore needs to prioritize research and learning in collection, metadata, and interface design. For example, librarians need to work with instructors to determine the types of metadata needed for learning purposes when developing related digital collections. In providing digital services, librarians have to devise services tailored to the needs and characteristics of academic users. In the dimension of context, the subjects of this study chose &#x201C;information ethics compliance&#x201D;, &#x201C;copyright&#x201D;, and &#x201C;content sharing&#x201D;, all important to academics, as the most important evaluation criteria. Digital libraries in academic settings need to provide information on ethics compliance, copyright, and content sharing options to guide and reassure users in their use of digital items.</p>
      <p>Certainly, this study has several limitations. Although the subjects were real users of digital libraries, the sample size might not be sufficient to represent the variety of digital library users. The results from the analysis of thirty users, including faculty, undergraduate, and graduate students, cannot be generalized to general public users. Also, the authors were not able to conduct a comparison analysis between different groups (e.g., faculty vs. students) because the sample was too small for statistical tests. Nevertheless, the findings of this study yield insightful information on users&#x2019; perceptions of digital library evaluation. In addition, the present study investigated only the user group among the different stakeholders. Although the ultimate objective of digital libraries is to serve users, end-users do not have sufficient knowledge of DL administration, collection development, or preservation techniques. Therefore, investigating other expert groups, such as digital librarians and scholars, is imperative for developing a comprehensive evaluation framework. Future work will examine opinions from other stakeholders, including scholars and digital librarians, and the authors plan to compare the opinions of these three groups to identify similarities and differences among them.</p>
    </sec>
  </body>
  <back>    
    <ref-list>
      <ref id="r001">
        <element-citation publication-type="journal">
          <annotation>
            <p>Borgman, C. L., G. H. Leazer, A. J. Gilliland-Swetland, and R. Gazan. 2000. &#x201C;Evaluating Digital Libraries for Teaching and Learning in Undergraduate Education: A Case Study of the Alexandria Digital Earth Prototype (ADEPT).&#x201D; <italic>Library Trends</italic>, 49: 228-250.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Borgman</surname>
              <given-names>C. L.</given-names>
            </name>
            <name>
              <surname>Leazer</surname>
              <given-names>G. H.</given-names>
            </name>
            <name>
              <surname>Gilliland-Swetland</surname>
              <given-names>A. J.</given-names>
            </name>
            <name>
              <surname>Gazan</surname>
              <given-names>R.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Evaluating Digital Libraries for Teaching and Learning in Undergraduate Education: A Case Study of the Alexandria Digital Earth Prototype (ADEPT).&#x201D;</article-title>
          <source>Library Trends</source>
          <year>2000</year>
          <volume>49</volume>
          <fpage>228</fpage>
          <lpage>250</lpage>
        </element-citation>
      </ref>
      <ref id="r002">
        <element-citation publication-type="journal">
          <annotation>
            <p>Buttenfield, B. 1999. &#x201C;Usability Evaluation of Digital Libraries.&#x201D; <italic>Science &#x26; Technology Libraries</italic>, 17(3/4):39-50.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Buttenfield</surname>
              <given-names>B.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Usability Evaluation of Digital Libraries.&#x201D;</article-title>
          <source>Science &#x26; Technology Libraries</source>
          <year>1999</year>
          <volume>17</volume>
          <issue>3/4</issue>
          <fpage>39</fpage>
          <lpage>50</lpage>
          <pub-id pub-id-type="doi">10.1300/J122v17n03_04</pub-id>
        </element-citation>
      </ref>
      <ref id="r003">
        <element-citation publication-type="journal">
          <annotation>
            <p>Candela, L., D. Castelli, P. Pagano, C. Thanos, Y. Ioannidis, G. Koutrika, S. Ross, H. Schek, and H. Schuldt. 2007. Setting the Foundations of Digital Libraries: The DELOS Manifesto. <italic>D-Lib Magazine</italic>, 13(3/4).[cited 2014.1.14]. &#x3C;http://www.dlib.org/dlib/march07/castelli/03castelli.html&#x3E;.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Candela</surname>
              <given-names>L.</given-names>
            </name>
            <name>
              <surname>Castelli</surname>
              <given-names>D.</given-names>
            </name>
            <name>
              <surname>Pagano</surname>
              <given-names>P.</given-names>
            </name>
            <name>
              <surname>Thanos</surname>
              <given-names>C.</given-names>
            </name>
            <name>
              <surname>Ioannidis</surname>
              <given-names>Y.</given-names>
            </name>
            <name>
              <surname>Koutrika</surname>
              <given-names>G.</given-names>
            </name>
            <name>
              <surname>Ross</surname>
              <given-names>S.</given-names>
            </name>
            <name>
              <surname>Schek</surname>
              <given-names>H.</given-names>
            </name>
            <name>
              <surname>Schuldt</surname>
              <given-names>H.</given-names>
            </name>
          </person-group>
          <article-title>Setting the Foundations of Digital Libraries: The DELOS Manifesto</article-title>
          <source>D-Lib Magazine</source>
          <year>2007</year>
          <volume>13</volume>
          <issue>3/4</issue>
        </element-citation>
      </ref>
      <ref id="r004">
        <element-citation publication-type="journal">
          <annotation>
            <p>Fox, E. A., R. M. Akscyn, R. K. Furuta, and J. J. Leggett. 1995. &#x201C;Digital Libraries.&#x201D; <italic>Communications of the ACM</italic>, 38(4): 23-28.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Fox</surname>
              <given-names>E. A.</given-names>
            </name>
            <name>
              <surname>Akscyn</surname>
              <given-names>R. M.</given-names>
            </name>
            <name>
              <surname>Furuta</surname>
              <given-names>R. K.</given-names>
            </name>
            <name>
              <surname>Leggett</surname>
              <given-names>J. J.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Digital Libraries.&#x201D;</article-title>
          <source>Communications of the ACM</source>
          <year>1995</year>
          <volume>38</volume>
          <issue>4</issue>
          <fpage>23</fpage>
          <lpage>28</lpage>
        </element-citation>
      </ref>
      <ref id="r005">
        <element-citation publication-type="confproc">
          <annotation>
            <p>Fuhr, N., P. Hansen, M. Mabe, A. Micsik, and I. Solvberg. 2001. &#x201C;Digital Libraries: A Generic Classification and Evaluation Scheme.&#x201D; <italic>Lecture Notes in Computer Science</italic>, 2163: 187-199.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Fuhr</surname>
              <given-names>N.</given-names>
            </name>
            <name>
              <surname>Hansen</surname>
              <given-names>P.</given-names>
            </name>
            <name>
              <surname>Mabe</surname>
              <given-names>M.</given-names>
            </name>
            <name>
              <surname>Micsik</surname>
              <given-names>A.</given-names>
            </name>
            <name>
              <surname>Solvberg</surname>
              <given-names>I.</given-names>
            </name>
          </person-group>
          <source>&#x201C;Digital Libraries: A Generic Classification and Evaluation Scheme.&#x201D;</source>
          <conf-name>Lecture Notes in Computer Science</conf-name>
          <year>2001</year>
          <volume>2163</volume>
          <fpage>187</fpage>
          <lpage>199</lpage>
        </element-citation>
      </ref>
      <ref id="r006">
        <element-citation publication-type="journal">
          <annotation>
            <p>Fuhr, N., G. Tsakonas, T. Aalberg, M. Agosti, P. Hansen, S. Kapidakis, C. Klas, L. Kov&#xE1;cs, M. Landoni, A. Micsik, C. Papatheodorou, C. Peters, and I. Solvberg. 2007. &#x201C;Evaluation of Digital Libraries.&#x201D; <italic>International Journal on Digital Libraries</italic>, 8(1): 21-38.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Fuhr</surname>
              <given-names>N.</given-names>
            </name>
            <name>
              <surname>Tsakonas</surname>
              <given-names>G.</given-names>
            </name>
            <name>
              <surname>Aalberg</surname>
              <given-names>T.</given-names>
            </name>
            <name>
              <surname>Agosti</surname>
              <given-names>M.</given-names>
            </name>
            <name>
              <surname>Hansen</surname>
              <given-names>P.</given-names>
            </name>
            <name>
              <surname>Kapidakis</surname>
              <given-names>S.</given-names>
            </name>
            <name>
              <surname>Klas</surname>
              <given-names>C.</given-names>
            </name>
            <name>
              <surname>Kov&#xE1;cs</surname>
              <given-names>L.</given-names>
            </name>
            <name>
              <surname>Landoni</surname>
              <given-names>M.</given-names>
            </name>
            <name>
              <surname>Micsik</surname>
              <given-names>A.</given-names>
            </name>
            <name>
              <surname>Papatheodorou</surname>
              <given-names>C.</given-names>
            </name>
            <name>
              <surname>Peters</surname>
              <given-names>C.</given-names>
            </name>
            <name>
              <surname>Solvberg</surname>
              <given-names>I.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Evaluation of Digital Libraries.&#x201D;</article-title>
          <source>International Journal on Digital Libraries</source>
          <year>2007</year>
          <volume>8</volume>
          <issue>1</issue>
          <fpage>21</fpage>
          <lpage>38</lpage>
          <pub-id pub-id-type="doi">10.1007/s00799-007-0011-z</pub-id>
        </element-citation>
      </ref>
      <ref id="r007">
        <element-citation publication-type="confproc">
          <annotation>
            <p>Hill, L.L., R. Dolin, J. Frew, R.B. Kemp, M. Larsgaard, D.R. Montello, M.-A. Rae, and J. Simpson. 1997. &#x201C;User Evaluation: Summary of the Methodologies and Results for the Alexandria Digital Library, University of California at Santa Barbara.&#x201D; <italic>Proceedings of 60th ASIST Annual Meeting</italic>. (pp. 225-243, 369). Medford, NJ: Information Today.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Hill</surname>
              <given-names>L.L.</given-names>
            </name>
            <name>
              <surname>Dolin</surname>
              <given-names>R.</given-names>
            </name>
            <name>
              <surname>Frew</surname>
              <given-names>J.</given-names>
            </name>
            <name>
              <surname>Kemp</surname>
              <given-names>R.B.</given-names>
            </name>
            <name>
              <surname>Larsgaard</surname>
              <given-names>M.</given-names>
            </name>
            <name>
              <surname>Montello</surname>
              <given-names>D.R.</given-names>
            </name>
            <name>
              <surname>Rae</surname>
              <given-names>M.-A.</given-names>
            </name>
            <name>
              <surname>Simpson</surname>
              <given-names>J.</given-names>
            </name>
          </person-group>
          <source>&#x201C;User Evaluation: Summary of the Methodologies and Results for the Alexandria Digital Library, University of California at Santa Barbara.&#x201D;</source>
          <conf-name>Proceedings of 60th ASIST Annual Meeting</conf-name>
          <conf-loc>Medford, NJ</conf-loc>
          <year>1997</year>
          <fpage>225</fpage>
          <lpage>243,369</lpage>
        </element-citation>
      </ref>
      <ref id="r008">
        <element-citation publication-type="journal">
          <annotation>
            <p>Hill, L. L., L. Carver, M. Larsgaard, R. Dolin, T. R. Smith, J. Frew, and M.-A. Rae. 2000. &#x201C;Alexandria Digital Library: User Evaluation Studies and System Design.&#x201D; <italic>Journal of the American Society for Information Science</italic>, 51: 246-259.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Hill</surname>
              <given-names>L. L.</given-names>
            </name>
            <name>
              <surname>Carver</surname>
              <given-names>L.</given-names>
            </name>
            <name>
              <surname>Larsgaard</surname>
              <given-names>M.</given-names>
            </name>
            <name>
              <surname>Dolin</surname>
              <given-names>R.</given-names>
            </name>
            <name>
              <surname>Smith</surname>
              <given-names>T. R.</given-names>
            </name>
            <name>
              <surname>Frew</surname>
              <given-names>J.</given-names>
            </name>
            <name>
              <surname>Rae</surname>
              <given-names>M.-A.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Alexandria Digital Library: User Evaluation Studies and System Design.&#x201D;</article-title>
          <source>Journal of the American Society for Information Science</source>
          <year>2000</year>
          <volume>51</volume>
          <fpage>246</fpage>
          <lpage>259</lpage>
          <pub-id pub-id-type="doi">10.1002/(SICI)1097-4571(2000)51:3&#x3C;246::AID-ASI4&#x3E;3.0.CO;2-6</pub-id>
        </element-citation>
      </ref>
      <ref id="r009">
        <element-citation publication-type="journal">
          <annotation>
            <p>Jeng, J. 2005. &#x201C;Usability Assessment of Academic Digital Libraries: Effectiveness, Efficiency, Satisfaction, and Learnability.&#x201D; <italic>Libri</italic>, 55: 96-121.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Jeng</surname>
              <given-names>J.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Usability Assessment of Academic Digital Libraries: Effectiveness, Efficiency, Satisfaction, and Learnability.&#x201D;</article-title>
          <source>Libri</source>
          <year>2005</year>
          <volume>55</volume>
          <fpage>96</fpage>
          <lpage>121</lpage>
        </element-citation>
      </ref>
      <ref id="r010">
        <element-citation publication-type="journal">
          <annotation>
            <p>Joo, S. and J. Lee. 2011. &#x201C;Measuring the Usability of Academic Digital Libraries: Instrument Development and Validation.&#x201D; <italic>The Electronic Library</italic>, 29(4): 523-537.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Joo</surname>
              <given-names>S.</given-names>
            </name>
            <name>
              <surname>Lee</surname>
              <given-names>J.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Measuring the Usability of Academic Digital Libraries: Instrument Development and Validation.&#x201D;</article-title>
          <source>The Electronic Library</source>
          <year>2011</year>
          <volume>29</volume>
          <issue>4</issue>
          <fpage>523</fpage>
          <lpage>537</lpage>
          <pub-id pub-id-type="doi">10.1108/02640471111156777</pub-id>
        </element-citation>
      </ref>
      <ref id="r011">
        <element-citation publication-type="journal">
          <annotation>
            <p>Marchionini, G., C. Plaisant, and A. Komlodi. 1998. &#x201C;Interfaces and Tools for the Library of Congress National Digital Library Program.&#x201D; <italic>Information Processing &#x26; Management</italic>, 34(5): 535-555.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Marchionini</surname>
              <given-names>G.</given-names>
            </name>
            <name>
              <surname>Plaisant</surname>
              <given-names>C.</given-names>
            </name>
            <name>
              <surname>Komlodi</surname>
              <given-names>A.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Interfaces and Tools for the Library of Congress National Digital Library Program.&#x201D;</article-title>
          <source>Information Processing &#x26; Management</source>
          <year>1998</year>
          <volume>34</volume>
          <issue>5</issue>
          <fpage>535</fpage>
          <lpage>555</lpage>
          <pub-id pub-id-type="doi">10.1016/S0306-4573(98)00020-X</pub-id>
        </element-citation>
      </ref>
      <ref id="r012">
        <element-citation publication-type="journal">
          <annotation>
            <p>Matusiak, K. K. 2012. &#x201C;Perceptions of Usability and Usefulness of Digital Libraries.&#x201D; <italic>The International Journal of Humanities and Arts Computing</italic>, 6(1-2): 133-147.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Matusiak</surname>
              <given-names>K. K.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Perceptions of Usability and Usefulness of Digital Libraries.&#x201D;</article-title>
          <source>The International Journal of Humanities and Arts Computing</source>
          <year>2012</year>
          <volume>6</volume>
          <issue>1-2</issue>
          <fpage>133</fpage>
          <lpage>147</lpage>
          <pub-id pub-id-type="doi">10.3366/ijhac.2012.0044</pub-id>
        </element-citation>
      </ref>
      <ref id="r013">
        <element-citation publication-type="book">
          <annotation>
            <p>Nielsen, J. 1993. <italic>Usability Engineering</italic>. Cambridge: Academic Press.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Nielsen</surname>
              <given-names>J.</given-names>
            </name>
          </person-group>
          <source>Usability Engineering</source>
          <year>1993</year>
          <publisher-loc>Cambridge</publisher-loc>
          <publisher-name>Academic Press</publisher-name>
        </element-citation>
      </ref>
      <ref id="r014">
        <element-citation publication-type="journal">
          <annotation>
            <p>Noh, Y. 2010. &#x201C;A Study on Developing Evaluation Criteria for Electronic Resources in Evaluation Indicators of Libraries.&#x201D; <italic>Journal of Academic Librarianship</italic>, 36(1): 41-52.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Noh</surname>
              <given-names>Y.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;A Study on Developing Evaluation Criteria for Electronic Resources in Evaluation Indicators of Libraries.&#x201D;</article-title>
          <source>Journal of Academic Librarianship</source>
          <year>2010</year>
          <volume>36</volume>
          <issue>1</issue>
          <fpage>41</fpage>
          <lpage>52</lpage>
          <pub-id pub-id-type="doi">10.1016/j.acalib.2009.11.005</pub-id>
        </element-citation>
      </ref>
      <ref id="r015">
        <element-citation publication-type="journal">
          <annotation>
            <p>Saracevic, T. and L. Covi. 2000. &#x201C;Challenges for Digital Library Evaluation.&#x201D; <italic>Proceedings of the American Society for Information Science</italic>, 37: 341-350.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Saracevic</surname>
              <given-names>T.</given-names>
            </name>
            <name>
              <surname>Covi</surname>
              <given-names>L.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Challenges for Digital Library Evaluation.&#x201D;</article-title>
          <source>Proceedings of the American Society for Information Science</source>
          <year>2000</year>
          <volume>37</volume>
          <fpage>341</fpage>
          <lpage>350</lpage>
        </element-citation>
      </ref>
      <ref id="r016">
        <element-citation publication-type="confproc">
          <annotation>
            <p>Saracevic, T. 2004. Evaluation of Digital Libraries: An Overview. Presented at the DELOS Workshop on the Evaluation of Digital Libraries. [cited 2014.2.14]. &#x3C;http://www.scils.rutgers.edu/~tefko/DL_evaluation_Delos.pdf&#x3E;.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Saracevic</surname>
              <given-names>T.</given-names>
            </name>
          </person-group>
          <source>Evaluation of Digital Libraries: An Overview</source>
          <conf-name>Presented at the DELOS Workshop on the Evaluation of Digital Libraries</conf-name>
          <year>2004</year>
        </element-citation>
      </ref>
      <ref id="r017">
        <element-citation publication-type="confproc">
          <annotation>
            <p>Tsakonas, G., S. Kapidakis, and C. Papatheodorou. 2004. Evaluation of User Interaction in Digital Libraries. In M. Agosti, N. Fuhr (eds.) Notes of the DELOS WP7 Workshop on the Evaluation of Digital Libraries, Padua, Italy.</p>
          </annotation>
          <person-group person-group-type="author">
            <name>
              <surname>Tsakonas</surname>
              <given-names>G.</given-names>
            </name>
            <name>
              <surname>Kapidakis</surname>
              <given-names>S.</given-names>
            </name>
            <name>
              <surname>Papatheodorou</surname>
              <given-names>C.</given-names>
            </name>
          </person-group>
          <person-group person-group-type="editor">
            <name>
              <surname>Agosti</surname>
              <given-names>M.</given-names>
            </name>
            <name>
              <surname>Fuhr</surname>
              <given-names>N.</given-names>
            </name>
          </person-group>
          <source>Evaluation of User Interaction in Digital Libraries</source>
          <conf-name>Notes of the DELOS WP7 Workshop on the Evaluation of Digital Libraries</conf-name>
          <conf-loc>Padua, Italy</conf-loc>
          <year>2004</year>
        </element-citation>
      </ref>
      <ref id="r018">
        <element-citation publication-type="journal">
          <annotation>
            <p>Van House, N. A., M. H. Butler, V. Ogle, and L. Schiff. 1996. &#x201C;User-centered Iterative Design for Digital Libraries: The Cypress Experience.&#x201D; <italic>D-Lib Magazine</italic>, 2.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Van House</surname>
              <given-names>N. A.</given-names>
            </name>
            <name>
              <surname>Butler</surname>
              <given-names>M. H.</given-names>
            </name>
            <name>
              <surname>Ogle</surname>
              <given-names>V.</given-names>
            </name>
            <name>
              <surname>Schiff</surname>
              <given-names>L.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;User-centered Iterative Design for Digital Libraries: The Cypress Experience.&#x201D;</article-title>
          <source>D-Lib Magazine</source>
          <year>1996</year>
          <volume>2</volume>
        </element-citation>
      </ref>
      <ref id="r019">
        <element-citation publication-type="journal">
          <annotation>
            <p>Xie, I. 2006. &#x201C;Evaluation of Digital Libraries: Criteria and Problems from Users&#x2019; Perspectives.&#x201D; <italic>Library &#x26; Information Science Research</italic>, 28(3): 433-452.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Xie</surname>
              <given-names>I.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Evaluation of Digital Libraries: Criteria and Problems from Users&#x2019; Perspectives.&#x201D;</article-title>
          <source>Library &#x26; Information Science Research</source>
          <year>2006</year>
          <volume>28</volume>
          <issue>3</issue>
          <fpage>433</fpage>
          <lpage>452</lpage>
          <pub-id pub-id-type="doi">10.1016/j.lisr.2006.06.002</pub-id>
        </element-citation>
      </ref>
      <ref id="r020">
        <element-citation publication-type="journal">
          <annotation>
            <p>Xie, I. 2008. &#x201C;Users&#x2019; Evaluation of Digital Libraries: Their Uses, Their Criteria, and Their Assessment.&#x201D; <italic>Information Processing &#x26; Management</italic>, 44(3): 1346-1373.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Xie</surname>
              <given-names>I.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Users&#x2019; Evaluation of Digital Libraries: Their Uses, Their Criteria, and Their Assessment.&#x201D;</article-title>
          <source>Information Processing &#x26; Management</source>
          <year>2008</year>
          <volume>44</volume>
          <issue>3</issue>
          <fpage>1346</fpage>
          <lpage>1373</lpage>
          <pub-id pub-id-type="doi">10.1016/j.ipm.2007.10.003</pub-id>
        </element-citation>
      </ref>
      <ref id="r021">
        <element-citation publication-type="journal">
          <annotation>
            <p>Ward, J.L. and S. Hiller. 2005. &#x201C;Usability Testing, Interface Design, and Portals.&#x201D; <italic>Journal of Library Administration</italic>, 43(1): 155-171.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Ward</surname>
              <given-names>J.L.</given-names>
            </name>
            <name>
              <surname>Hiller</surname>
              <given-names>S.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Usability Testing, Interface Design, and Portals.&#x201D;</article-title>
          <source>Journal of Library Administration</source>
          <year>2005</year>
          <volume>43</volume>
          <issue>1</issue>
          <fpage>155</fpage>
          <lpage>171</lpage>
          <pub-id pub-id-type="doi">10.1300/J111v43n01_10</pub-id>
        </element-citation>
      </ref>
      <ref id="r022">
        <element-citation publication-type="journal">
          <annotation>
            <p>Zhang, Y. 2010. &#x201C;Developing a Holistic Model for Digital Library Evaluation.&#x201D; <italic>Journal of the American Society for Information Science and Technology</italic>, 61(1): 88-110.</p>
          </annotation>
          <person-group>
            <name>
              <surname>Zhang</surname>
              <given-names>Y.</given-names>
            </name>
          </person-group>
          <article-title>&#x201C;Developing a Holistic Model for Digital Library Evaluation.&#x201D;</article-title>
          <source>Journal of the American Society for Information Science and Technology</source>
          <year>2010</year>
          <volume>61</volume>
          <issue>1</issue>
          <fpage>88</fpage>
          <lpage>110</lpage>
          <pub-id pub-id-type="doi">10.1002/asi.21220</pub-id>
        </element-citation>
      </ref>
    </ref-list>
  </back>
</article>
