Indicator Search

Select Logic Model Components and Categories to filter content. To select multiple filters per box, hold the 'control' key while selecting. To export indicators, click the check box to select the item(s), then click the export button.

Number Returned: 17
ID Indicator Category Sub Category Logic Model Component Short Definition Long Definition Data Requirements Data Sources Disaggregation Frequency of Data Collection Purposes Issues and Challenges Related Indicators Sample Topics and Questions for Data Collection Instruments Resources Pages in the Guide Published Year Adaptation of Indicator for Specific Purpose (Illustrative Examples) Data Type(s) Intended Use Other Relevant Information Last Updated Date Indicator Snapshots
14 Number of individuals served by a KM output, by type Captures the number of people that a KM output directly influences, disaggregated by type of output This indicator captures the number of people that a KM output directly influences. The type of KM output should be specified so data can be aggregated or disaggregated as needed. For instance, the number can represent people attending a meeting, seminar, or conference as well as those joining in a virtual learning or networking activity. This number could also represent the subscribers or recipients of a product, service, or publication. The indicator can be used to measure various kinds of outputs. Quantitative data from evidence that intended users—such as recipients, subscribers, participants, or learners—have received, registered, or participated in a particular KM output, whether in person or virtually. Mail (postal or email), contact, or subscriber lists; registration or attendance records; and other administrative records and databases Quarterly These data chart the initial reach of the KM output and identify which users were reached. This is one of the most basic indicators for measuring reach. It is a simple way to gauge initial communication of and access to the KM output. Stratifying the data in various ways can help profile users. Supplementary information collected could include demographic and job-related characteristics of these individuals, such as the country/region where they work and their organizational affiliation, job function or type, gender, and level of education. Additional information may also include the type of dissemination or promotion and the communication channels used, such as print, in-person, electronic (either online or offline), or other media. How many times have you accessed the [Web product] in the past 3 months? (Select one.) 
o 0 times
o 1-5 times
o 6-10 times
o 11-15 times
o 16-20 times
o 20+ times
o Never heard of it
34-35 2013
• Number of registered learners in an eLearning service
• Number of recipients who received a copy of a handbook via initial distribution
• Number of registered participants in a training seminar
• Number of fans and followers on social media accounts
Count Wednesday, September 6, 2017 The Global Newborn Health Conference—held April 14-18, 2013, in Johannesburg, South Africa, and sponsored by the Maternal and Child Health Integrated Program—counted among its participants 70 senior government health officials from 50 countries. Since January 2012, MEASURE Evaluation has hosted 29 webinars covering seven topics related to the monitoring and evaluation of population, health, and nutrition programs. The webinars attracted 1,228 participants.
15 Number of copies or instances of a KM output distributed to existing lists, by type of output Captures the number and type of a KM output that has been distributed This indicator captures the number and type (such as document copies or email announcements) of a KM output that has been distributed. Use of existing lists indicates that this is an initial and direct distribution or dissemination from the original developer of the output, such as an organization or project. Distribution of the output can be by mail, in person, online, or via any other medium. Electronic distribution of copies includes various file formats, such as PDF, TXT, PPT, or HTML. Quantitative data on the number of hard/paper or electronic copies distributed by language, types/formats of the product, how/where the copies were distributed, and dates distributed. Administrative records Creating a database designed specifically to track distribution/dissemination numbers is helpful. Quarterly This is a direct and simple measurement of the quantity of an output (such as an email announcement or handbook) distributed. This indicator contributes to measuring reach. Due to the rapid advancement and growing availability of information and communication technologies in recent years, many organizations and projects have been shifting the scope and focus of their dissemination efforts from printing and distributing paper copies to using electronic channels. Electronic copies can be distributed to intended or potential users by email as attachments or as web links. While electronic distribution can potentially reach more people for a lower cost, poor internet access and low storage capacity on a mobile device or computer may limit the reach of distribution efforts. Measuring the types of outputs and channels used can help determine the efficiency and effectiveness of the current distribution channels.
35-36 2013
• Number of copies of an implementation guide distributed
• Number of notifications emailed announcing a new issue of an online journal
Count Wednesday, September 6, 2017 During the four-day Global Newborn Health Conference, the Twitter hashtag #Newborn2013 had an estimated reach of 2,979,300. It generated approximately 10,423,631 impressions and was tweeted more than 1,686 times by more than 650 contributors. Since 2007, 500,000 copies of Family Planning: A Global Handbook for Providers have been distributed. The handbook is available in nine languages: English, French, Portuguese, Russian, Spanish, Swahili, Romanian, Hindi, and Farsi. Recently the handbook was made into an online resource, with digital downloads available for mobile devices. As a result, the number of requests for paper copies has steadily decreased. Having the data and the timeline of dissemination helped show that the decline was due to interest in the new distribution channel, not a lack of interest in the handbook.
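The guidance above recommends a purpose-built database for tracking distribution and dissemination counts. A minimal sketch using Python's built-in sqlite3 module is shown below; the table name, column names, and figures are illustrative, not drawn from any project described in this guide:

```python
import sqlite3

# Hypothetical schema for tracking distribution of KM outputs.
conn = sqlite3.connect(":memory:")  # use a file path in practice
conn.execute("""
    CREATE TABLE distribution (
        output_name TEXT,      -- e.g., 'Implementation Guide'
        output_format TEXT,    -- e.g., 'print', 'PDF', 'HTML'
        language TEXT,
        channel TEXT,          -- e.g., 'mail', 'email', 'in person'
        copies INTEGER,
        date_distributed TEXT  -- ISO date
    )
""")
rows = [
    ("Implementation Guide", "print", "English", "mail", 500, "2013-03-01"),
    ("Implementation Guide", "PDF", "English", "email", 1200, "2013-03-05"),
    ("Implementation Guide", "PDF", "French", "email", 300, "2013-04-10"),
]
conn.executemany("INSERT INTO distribution VALUES (?, ?, ?, ?, ?, ?)", rows)

# Quarterly report: copies distributed, disaggregated by format.
for fmt, total in conn.execute(
    "SELECT output_format, SUM(copies) FROM distribution "
    "GROUP BY output_format ORDER BY output_format"
):
    print(fmt, total)  # PDF 1500, then print 500
```

The same table can be queried by language, channel, or date range to produce the disaggregations this indicator calls for.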
16 Number of delivery methods used to disseminate content, by type Captures the number and type of delivery methods used to disseminate or promote content and messages This indicator captures the number and type of delivery methods used to disseminate or promote content and messages across a KM project or specific activity. It can apply to a wide range of methods, such as online sources, web tools, print copies, and electronic offline devices. Examples of electronic offline delivery devices include flash drives, CD-ROMs, netbooks, tablets, eReaders, mobile phone apps, and portable audio devices. Quantitative data on the number of media types used and number of copies of the product distributed (see Indicator 15) through each method and different formats for each method, such as ePub and Kindle for eReaders and Android and iPhone for phone apps. Administrative records Creating a spreadsheet or list designed specifically to track distribution/dissemination numbers is helpful. Quarterly Organizations and projects implementing KM activities need to assess the effectiveness of the method mix by disaggregating monitoring data by delivery method; over time they may decide to add/reduce the types of media according to these findings. The strategy to select certain delivery methods over others and/or to offer information through multiple methods should be based on a thorough understanding of users. How did you first learn about the [Web product]? (Select one.) 
o Announcement (e.g., email, paper)
o Colleague's referral
o Internet search
o Conference/meeting
o Promotional materials (e.g., fact sheet, flyer)
o Link from another website
o Social media (e.g., Facebook, Twitter)
o Other, please specify __________
36 2013 Count Wednesday, September 6, 2017 MEASURE Evaluation uses 14 communication methods to share news, publications, presentations, events, and conversations, including its website, print and electronically produced publications, the Monitor e-newsletter, the Evaluate blog, SlideShare, YouTube, Twitter, Facebook, Flickr, LinkedIn, webinars, Knowledge Gateway, Google+, and podcasts. Content from the Global Newborn Health Conference was distributed by at least 9 delivery methods, including live presentations, printed materials, Twitter, Facebook, websites, webcasts, email, the Scribd digital document library, and blogs/blog posts.
17 Number of media mentions resulting from promotion Captures how many times a KM output has been mentioned in various forms of news media coverage This indicator captures how many times a KM output has been mentioned in various forms of news media coverage, such as print and online news sources, LISTSERVs, blogs/blog posts, television, or radio. A media mention usually indicates, to some degree, that the original output is recognized, credible, and considered authoritative. Quantitative data on the number of mentions in or on print, online, social media, television, or radio, and the numbers of people reached by each news media outlet, if available. Administrative records, media outlets, reports from clipping services, media monitoring services, and internet monitoring tools, such as Google Alerts and Yahoo Pipes Quarterly This indicator measures the media coverage of a KM output, or a group of KM outputs, and tracks the coverage to gauge the effect of reach, promotion, and outreach efforts. The media coverage can be about the KM output itself or about the issue or content featured in the KM output. News media coverage measures whether intermediaries thought their audiences would be interested and consider the issue important. Since the news media often help set political and policy agendas, an indicator of news media coverage can suggest whether policy makers might be influenced to give an issue greater priority. A news media strategy is a road map for reaching and influencing policy makers indirectly. An advantage of a media mention can be the potentially large population reached through this secondary/in-direct method of dissemination. However, the total impact may not be great if the mention is brief and most of the people listening/ watching are not interested. 
For web-based products, services, publications, and content, a web monitoring tool, such as Google Alerts or Yahoo Pipes, provides a quick and easy way to set up specific queries and monitor mentions in online media. A number of media monitoring services and software also cover print, television, social media, and other types of broadcasting. Several challenges may impede using a media coverage service. First, these services charge a fee, which may be beyond your project budget. Second, it can be difficult to capture all instances of media coverage, especially in broadcasts. A solution may be to organize staff when a news-making publication comes out, so they can monitor various news media outlets for coverage of the story. However, this means you need to have enough human resources to put toward this task. 36-37 2013 Count Wednesday, September 6, 2017 From July 2012 to June 2013, the K4Health project had 52 media mentions from promotion, meeting the annual project target of 50. Many of the media mentions were by various blogs managed by other global health organizations, such as the USAID Impact Blog; by news or announcements websites, such as News Medical; and by digital health news sources, such as the Kaiser Daily Global Health Policy Report.
18 Number of times a KM output is reprinted/reproduced/replicated by recipients Collects the number of specific cases in which an organization or independent body—other than the one that originally authored, funded, produced, or sponsored a KM output—decides to use its own resources to copy or excerpt all or part of the KM output This indicator collects the number of specific cases in which an organization or independent body—other than the one that originally authored, funded, produced, or sponsored a KM output—decides to use its own resources to copy or excerpt all or part of the KM output. “Reprint” is a term specific to publications and other print resources, while “reproduction” can apply to products and services, and “replication” can refer to approaches and techniques. Thus, the number refers not only to print copies, but also to online copies in any online medium, or even to other KM events or approaches. Quantitative data from requests for approval or permission to reprint, reproduce, or replicate, which indicate the number of items produced and, if applicable, which parts of those documents; and/or copies or other evidence of reprinting, reproduction, or replication. Administrative records, letters, emails, communication of request and acknowledgment, or receipts and online pages that track use and downloads of web-based products, such as open source content management systems Quarterly Reprints, reproductions, and replicated approaches demonstrate demand for a particular KM output and extend the reach of the output beyond what was originally feasible. An added value of this indicator is that a desire to reprint, reproduce, or replicate suggests an independent judgment that the KM output is useful and of high quality. A limitation of this indicator is that the original publishers or developers have to rely on what is reported or sent to them or what they happen to come across after reprinting and reproduction. 
It is not possible to know with certainty the extent of reprinting and reproduction, as some re-publishers think they would not receive permission to reprint, so they do not tell the original publisher their materials are being used. Also, it may be difficult to find out the extent of dissemination, the identity of the recipients, or the use of the reprint. These limitations apply to both online and print media. 37-38 2013 Count Wednesday, September 6, 2017 OpenAid is a website platform designed and built by the USAID-funded Knowledge for Health Project to help small non-governmental organizations and international development projects quickly create cost-effective, program-focused websites (http://drupal.org/project/openaid). OpenAid was released in July 2012. As of June 2013, 60 different sites were using the OpenAid platform.
19 Number of file downloads Captures the number of times a file is downloaded from a website to a user’s own electronic storage medium This indicator captures the number of times a file is downloaded or content is transferred from a website to a user’s own electronic storage medium. Quantitative data from web server log files, web analytics, and/or content management system records Web server log files; web analytics software, such as WebTrends, Google Analytics, or Piwik; content management systems, such as Drupal and Joomla Quarterly Tracking the number of file downloads provides insight into which information products and topics website visitors most frequently save to their own electronic storage medium. In addition to tracking general trends, file download data can also help indicate how well promotional efforts and campaigns have reached online users. There are two ways to count downloads: by examining server logs or by using web analytics software. Server logs are produced automatically on a typical web server, and can help staff distinguish between partial and completed downloads. However, content and communications staff may need assistance from the hosting company or internal IT staff to access and understand server logs. A web analytics interface such as Google Analytics or the WordPress analytics plug-in uses tags and cookies to track web traffic and can be configured to track downloads. Once set up, this method requires less specialized IT knowledge than accessing or analyzing server log files. Analytics programs also often allow users to filter download data—for example, to show the geographic location of users who download a specific file. While analytics programs are easier to use, they still require a certain level of expertise, and a learning curve should be expected. For more information about Web analytics, see Appendix 3 on p.83. 
38-39 2013 Count Wednesday, September 6, 2017 In the first quarter after launching social media channels, document downloads on the ICT for Ag community website (ictforag.org) increased nearly fivefold. The film In It to Save Lives: Scaling Up Voluntary Medical Male Circumcision (VMMC) for HIV Prevention for Maximum Public Health Impact (http://www.aidstar-one.com/focus_areas/prevention/resources/vmmc)—produced by AIDSTAR-One, funded by USAID, and managed by John Snow, Inc.—received a total of 3,789 plays between June 1, 2011 and June 30, 2012. Over 690 downloads were associated with the AIDSTAR-One VMMC materials. The film was downloaded from the AIDSTAR-One website 224 times, the discussion guide was downloaded 121 times, and the transcript was downloaded 123 times. The film was downloaded from 36 countries; the top five countries were the United States, Kenya, Uganda, South Africa, and the United Kingdom.
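As the guidance above notes, server logs can distinguish partial from completed downloads. The following is a minimal, hypothetical Python sketch that counts downloads from Apache-style access log lines, treating HTTP status 200 as a completed transfer and 206 as a partial (ranged or resumed) one; the log lines, paths, and extension list are invented for illustration:

```python
import re
from collections import Counter

# Matches the request and status fields of a common Apache-style log line.
LOG_PATTERN = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_downloads(log_lines, extensions=(".pdf", ".ppt", ".zip")):
    """Count completed (200) and partial (206) downloads per file path."""
    completed = Counter()
    partial = Counter()
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if not m:
            continue
        path, status = m.group("path"), m.group("status")
        if not path.lower().endswith(extensions):
            continue  # only count document files, not HTML pages
        if status == "200":
            completed[path] += 1
        elif status == "206":
            partial[path] += 1
    return completed, partial

sample = [
    '1.2.3.4 - - [01/Apr/2013] "GET /docs/handbook.pdf HTTP/1.1" 200 512000',
    '1.2.3.5 - - [01/Apr/2013] "GET /docs/handbook.pdf HTTP/1.1" 206 1024',
    '1.2.3.6 - - [01/Apr/2013] "GET /index.html HTTP/1.1" 200 2048',
]
completed, partial = count_downloads(sample)
print(completed["/docs/handbook.pdf"], partial["/docs/handbook.pdf"])  # 1 1
```

In practice the extension list, log format, and the rule for treating 206 responses would need to match your server's configuration.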
20 Total number of pageviews Captures the total number of times a page is viewed by a visitor This indicator captures the total number of times a page is viewed by a visitor. Pageviews are measured when a page’s tracking code is executed on a website. Quantitative data from web analytics Web analytics software, such as Google Analytics, Piwik, or WebTrends Quarterly Pageviews are a typical general measure of how much a website is used. In the early days of the Internet, use was measured in “hits.” However, “hits” are no longer a meaningful measurement. A “hit” is a call to a web server for a specific action or element, but a modern web page is much more complex and can involve anywhere from one to hundreds of individual “hits.” With pageviews, the trend is important, not the raw numbers. Watching a specific page's performance can be useful. For example, a spike in views of a specific page can indicate the success of a promotion. Web traffic varies greatly, depending on the size and scope of a website. If your site serves a small community of practice, do not compare your pageview count to a site that serves a broader audience. For more information about Web analytics, see Appendix 3 on p.83. 40 2013 Count Wednesday, September 6, 2017 The GNH Conference website (http://www.newborn2013.com/) was first launched in January 2013. It generated 29,562 pageviews up until May 5, 2013. Between June 1, 2011 and June 30, 2012, the materials page of the film In It to Save Lives: Scaling Up Voluntary Medical Male Circumcision for HIV Prevention for Maximum Public Health Impact (http://www.aidstar-one.com/focus_areas/prevention/resources/vmmc/resource_packet) generated a total of 5,666 pageviews. The VMMC landing page, with the embedded video, generated 1,204 pageviews from 89 countries. About 20 percent of all pageviews came from Africa. Since MEASURE Evaluation started using SlideShare in June 2008, the project’s 229 slides have received a total of 174,162 pageviews. 
Most of the pageviews came from the United States (35,731), Bangladesh (4,975), Ethiopia (4,460), Nigeria (2,930), Kenya (2,859), and India (2,831).
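Since the trend matters more than the raw pageview counts, a simple way to spot a promotion-driven spike is to compare each month against a trailing average. A rough sketch, with invented monthly figures and an arbitrary 1.5x threshold:

```python
# Minimal sketch: flag monthly pageview spikes against a trailing average.
# The data and threshold below are illustrative, not from this guide.
def flag_spikes(monthly_views, window=3, threshold=1.5):
    """Return (month, views, baseline) for months exceeding threshold x trailing mean."""
    spikes = []
    for i in range(window, len(monthly_views)):
        month, views = monthly_views[i]
        trailing = [v for _, v in monthly_views[i - window:i]]
        baseline = sum(trailing) / window
        if baseline and views > threshold * baseline:
            spikes.append((month, views, round(baseline)))
    return spikes

data = [("Jan", 900), ("Feb", 1000), ("Mar", 1100), ("Apr", 1050), ("May", 2600)]
print(flag_spikes(data))  # [('May', 2600, 1050)]
```

A flagged month can then be checked against the promotion calendar to see whether the spike lines up with a specific campaign.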
21 Total number of page visits Captures the total number of “visits” or individual interactions with a website This indicator captures the total number of “visits” or individual interactions with a website. According to the Web Analytics Association’s web analytics definitions, a “visit” is an individual interaction with a website. If the individual leaves the website or does not take another action—typically requesting additional pageviews—on the site within a specified time interval, web analytics software considers the visit ended. Quantitative data from web analytics Web analytics software such as Google Analytics, Piwik, or WebTrends Quarterly Visits represent the number of times users have gone to and then left a website. A visit can range from a few seconds to several hours, and a single visitor can log multiple visits to a page, even in the course of a single day. Different web analytics programs define a visit differently. In general, a visit begins when a visitor views the first page of a website and ends when a criterion set by the analytics program is met, such as if the visitor does not click anything for 30 minutes. In addition to representing the volume of traffic to a website, visit numbers are used to compute other common indicators, such as average visit duration and average page depth—the average number of pages viewed during a visit to a website. Some organizations may find it useful to further qualify this indicator as it relates to key intended users, such as visitors from specific countries or regions. As with pageviews, the trend of total visits is more important than the raw numbers. And, while the total number of visits can provide insight into the total number of times people consulted a site, it cannot distinguish between repeat and one-time visitors. For more information about Web analytics, see Appendix 3 on p.83. 
40-41 2013 Count Wednesday, September 6, 2017 Since launching in February 2011, visits to the ICTforAg community website (ictforag.org) have grown steadily from 200 visits per month up to 1,000 visits per month, peaking at over 2,000 visits in January 2013. During the month of April 2012, the K4Health website (www.k4health.org) received 60,371 visits, an average of 2,012 per day. In the 2012 calendar year, 22% (40,250) of visits to K4Health toolkits came from USAID family planning priority countries.
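The 30-minute inactivity rule described above can be expressed as a simple sessionization routine. A minimal sketch, assuming one visitor's pageview timestamps are already sorted:

```python
# Minimal sketch: group one visitor's timestamped pageviews into visits
# using a 30-minute inactivity timeout, the common default described above.
SESSION_TIMEOUT = 30 * 60  # seconds

def count_visits(timestamps, timeout=SESSION_TIMEOUT):
    """timestamps: sorted Unix times of a single visitor's pageviews."""
    if not timestamps:
        return 0
    visits = 1
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > timeout:
            visits += 1  # gap exceeds the timeout: a new visit begins
    return visits

# Three pageviews within 30 minutes, then one two hours later -> 2 visits.
hits = [0, 600, 1500, 1500 + 2 * 3600]
print(count_visits(hits))  # 2
```

Real analytics packages apply extra rules (midnight cutoffs, campaign changes), which is why different programs report slightly different visit counts for the same traffic.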
22 Number of links to web products from other websites Captures the number of links, or URLs, located on another website that direct users to the publisher’s website This indicator captures the number of links, or URLs, located on another website that direct users to the publisher’s website. The referring website creates and maintains these links. Quantitative data from web analytics, webmaster tools, search engine optimization (SEO) tools Web analytics software, such as Google Analytics, Piwik, or WebTrends; webmaster reports, such as those from Google Webmaster Tools, Bing Webmaster Tools, or Alexa.com; SEO tools such as Majestic SEO or Open Site Explorer Quarterly The number of links and variety of referring sources directing traffic to an organization’s online information products indicate both reach and authority. If reputable websites link to an organization’s website or its online resources, one can reasonably argue that the referring site has recognized the publisher’s authority on a given topic. Some search engines can provide information on what other websites link to a specific site. For example, searching in Google for "www.mysite.com" returns a list of URLs that provide links to www.mysite.com. However, data from search engines are far from comprehensive, as most search engines make only partial data available in order to maintain the confidentiality of their ranking algorithms and to deter spammers. A more comprehensive view may be available through webmaster tools provided by services like Google or Bing. Like webmaster tools, SEO tools directed at online marketing professionals can provide similar link data. However, most SEO tools cost in the range of $75 to $150 per month, which is out of reach for many programs and small organizations. For more information about Web analytics, see Appendix 3 on p.83. 41-42 2013 Count Wednesday, September 6, 2017 As of January 2013, 5,917 sources provided referral links to web pages on www.k4health.org. 
As of August 2013, 940 websites link to www.measureevaluation.org.
23 Number of people who made a comment or contribution Captures active sharing of programmatic experience and knowledge among people participating in KM outputs This indicator captures active sharing of programmatic experience and knowledge among people participating in KM outputs—usually those hosted online, such as professional network groups, communities of practice, forums, webinars, or social media blogs or sites like Facebook or LinkedIn. The online format makes data collection easy by digitally storing comments and contributions such as postings or materials uploaded into a platform. The number of contributors indicates how many have interacted with the other users and have shared their personal experiences, knowledge resources, or opinions with others. This count helps the organizer to assess the depth of user engagement. Quantitative data from sources that provide the number of participants, electronic records of postings from participants, identification of product or issue under discussion, characteristics of participants such as country/region where they work, organizational affiliation, job function or type, gender, and level of education. Qualitative data from content analyses of comments and contributions that provide more detailed information about user characteristics, types, themes of contributions Administrative records of comments posted via LISTSERVs, discussion groups, communities of practice, or social media tools Quarterly Counting attendance is a valid measure, but it does not indicate the degree of engagement. The total number of people in attendance includes people who contribute significantly, those who share a little, and those who listen without contributing, otherwise known as lurkers. Lurkers are usually the majority, especially in virtual settings. Direct user interactions indicate interest in the subject matter, which in turn speaks to the relevance of the KM output. 
In addition, contributions suggest that the users feel encouraged and comfortable contributing; thus, they have developed a sense of community and belonging in a particular group, which may stimulate further knowledge sharing. However, the indicator does not usually suggest how the user will use the information/product/output in the future or whether the information will continue to spread through the professional networks of the attendees and contributors. For more information about Web analytics, see Appendix 5 on p.92. 42-43 2013 Count Wednesday, September 6, 2017 During the LeaderNet webinar on blended learning, 275 participants logged on from 56 countries, sharing 308 posts in English, Spanish, and French. As of June 2013, there were 7,924 subscriptions to 11 communities of practice managed by MEASURE Evaluation. During the project’s fifth year (July 2012 – June 2013), 273 subscribers posted new insights and knowledge to the community LISTSERVs. In August 2013, MEASURE Evaluation shared a post on LinkedIn about the availability of its M&E materials for trainers. The post received 15 shares, 33 comments, and 16 likes in the Monitoring and Evaluation Professionals LinkedIn group. A blog post containing the same information received 21 Twitter shares and 16 Facebook shares.
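Comparing unique contributors against total attendance, as described above, yields a simple engagement summary that separates contributors from lurkers. A minimal sketch with invented figures:

```python
# Minimal sketch: gauge depth of engagement by comparing unique contributors
# to total attendance; the remainder are "lurkers," as described above.
# The attendee count and post authors below are illustrative.
def engagement_summary(attendees, posts):
    """posts: list of author identifiers, one entry per comment/contribution."""
    contributors = set(posts)
    return {
        "attendees": attendees,
        "contributors": len(contributors),
        "lurkers": attendees - len(contributors),
        "contribution_rate": round(len(contributors) / attendees, 3),
    }

posts = ["ana", "bo", "ana", "cai", "bo", "ana"]
print(engagement_summary(attendees=50, posts=posts))
```

The same author list can also be disaggregated by country or job function when those characteristics are captured at registration.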
55 Number of people trained in adaptive practices Refers to the number of staff trained in iterative approaches to learning and adapting This indicator refers to the number of staff trained in iterative approaches to learning and adapting to improve projects, programs, or initiatives. Quantitative data from administrative records or reports that provide the number of participants, characteristics of participants, gender, and other relevant information Administrative records and reports After specific activities This indicator tracks the initial reach of adaptive practices. It is a simple way to establish a foundation of staff trained in programmatic flexibility and change. Including adaptive management approaches in a work plan does not necessarily mean those sessions inform decision making. Likewise, providing training sessions does not guarantee that participants internalized the learning or will facilitate or use the training materials in the future. 2017 Count Wednesday, December 13, 2017
56 Percentage of target staff reporting an improvement in capacity to use adaptive practices Refers to the percentage of target staff reporting an improvement in capacity (knowledge, skills, or abilities) to use adaptive practices This indicator refers to the percentage of target staff reporting an improvement in capacity (knowledge, skills, or abilities) to use adaptive practices for the management of a project, program, or initiative as a result of participating in training or other activities aimed at building capacity in adaptive management. Target staff may include partners. Self-reporting, pre- and post-evaluations, or follow-up surveys should be conducted to determine the extent to which there was an improvement in awareness, understanding, or capacity in iterative approaches to learning and adapting. Quantitative data from pre- and post-tests using survey questions and Likert scales to determine capacity to use adaptive practices and follow-up assessments at three and/or six months to determine knowledge retention; qualitative data can provide greater insight into target user capacity Pre- and post-tests, follow-up surveys Quarterly, semiannually, or after specific activities This indicator can be used to monitor changes in capacity (awareness, knowledge, and skills) in adaptive practices over time (before training/activity and after training/activity). It is a simple way to establish a foundation of staff trained in programmatic flexibility and change. Although a project, program, or initiative may include adaptive management approaches in its work plan, it does not necessarily reflect the use of those sessions in decision making. Self-reported data may be biased and may not empirically represent the context or practice. 2017 Proportion Wednesday, December 13, 2017
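The percentage for this indicator can be computed directly from paired pre- and post-test ratings. A minimal sketch, assuming a 1-5 Likert scale and hypothetical staff scores, counting a respondent as improved when the post rating exceeds the pre rating:

```python
# Minimal sketch: percentage of target staff reporting improved capacity,
# from paired pre/post self-ratings on a 1-5 Likert scale (hypothetical data).
def pct_improved(pre_post_scores):
    """pre_post_scores: dict of staff id -> (pre, post) Likert ratings."""
    improved = sum(1 for pre, post in pre_post_scores.values() if post > pre)
    return 100.0 * improved / len(pre_post_scores)

scores = {"s1": (2, 4), "s2": (3, 3), "s3": (1, 3), "s4": (4, 5), "s5": (3, 2)}
print(pct_improved(scores))  # 60.0
```

A follow-up assessment at three or six months can reuse the same calculation against the original pre-test ratings to gauge knowledge retention.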
57 Number of approaches, methods, tools, or events implemented for reflection and other adaptive practices Refers to the number of adaptive practices (approaches, methods, tools, or events) used to facilitate the adaptive management of a project, program, or initiative This indicator refers to the number of specific adaptive practices, including approaches, methods, tools, or events used to facilitate the adaptive management of a project, program, or initiative. This may include the number of in-person learning events, after-action reviews, lesson-learned workshops, communities of practice, new technologies that facilitate increased ease and frequency of interaction, or other related iterative approaches to learning and adapting. Although this indicator measures the number of approaches, this should not suggest that it is better to use more approaches. The intent is to measure the intentionally selected approaches used to facilitate adaptive management. Quantitative or qualitative data from programmatic records, or self-report of number of adaptive practices conducted, by type; qualitative data to provide greater insight into actual use by staff Administrative records and reports, self-report surveys Quarterly, semiannually This indicator reports the actual implementation of the planned adaptive practices that were identified to be used in the project, program, or initiative. Although a project, program, or initiative has conducted adaptive management approaches, and followed through on its identified work plan activities, it does not necessarily mean that the sessions were of high quality or contributed to programmatic improvements. Projects, programs, and initiatives may find it more useful to measure the proportion of staff using adaptive approaches; however, because the number of staff may change over time (expand and contract), this proportion may be difficult to measure consistently. 2017 Count Wednesday, December 13, 2017
58 Number of sessions or activities that include analysis of and/or reflection on monitoring data Refers to the number of sessions or activities focused on reflection and analysis of monitoring data from a project, program, or initiative to inform performance and adjustments This indicator refers to the number of sessions or activities focused on reflection and analysis of monitoring data from a project, program, or initiative to inform performance and adjustments. This may include modifying results reviews, data-quality assessments, or other monitoring and evaluation activities (implemented for accountability and to inform decision making) to include more learning and reflection. Quantitative data from administrative records or reports that provide the number of sessions or activities implemented with a focus on reflection and analysis of monitoring data Administrative records and reports Quarterly, semiannually, or after specific activities This indicator reports on the implementation of planned adaptive practices, identified for use in the project, program, or initiative, that specifically used routine or other monitoring data collected by the project, program, or initiative. Even if a project, program, or initiative has conducted adaptive management approaches and followed through on its identified work plan activities, this does not necessarily mean that the sessions were of high quality or contributed to programmatic improvements. Quality can be assessed through user satisfaction surveys (see indicators 24 to 28); actions taken to make programmatic improvements can be assessed in an internal assessment. 2017 Count Wednesday, December 13, 2017
59 Number of actionable recommendations identified or collected to inform project, program, or initiative performance Refers to the number of actionable recommendations to inform project, program, or initiative performance or adjustments that were collected from the use of adaptive practices This indicator refers to the number of actionable recommendations to inform project, program, or initiative performance or adjustments that were identified or collected from the use of adaptive practices or from sessions or activities focused on reflection and analysis of monitoring data. The number can be used to calculate a percentage of actions taken. Quantitative and qualitative data from the review and analysis of meeting minutes, reports, and other documentation to determine how many recommendations are actionable Administrative records and reports Quarterly, semiannually, or after specific activities This indicator reports the number of recommendations collected that are actionable. A recommendation should identify a point of contact or a timeframe for its use, rather than be a general statement that provides no next steps. This indicator should help staff reflect on whether the stated recommendations can be used, by whom, and by when. By collecting the number of actionable recommendations, a percentage of actions taken can be calculated during an internal assessment (see Act subcategory). Even if a project, program, or initiative has conducted adaptive management approaches and followed through on its identified work plan activities, this does not necessarily mean that the sessions were of high quality or contributed to programmatic improvements. Review and analysis of documentation can help determine how many recommendations are actionable, for example, whether the actions fall within the scope of the project, program, or initiative; whether responsibility and next steps are clearly documented; and whether budget and time have been allocated.
Teams can determine how best to define "actionable" to meet the project, program, and/or initiative needs. 2017 Count Wednesday, December 13, 2017
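The record notes that the count of actionable recommendations can feed a "percentage of actions taken" during an internal assessment. A minimal sketch of that follow-on calculation, assuming a hypothetical record structure in which each recommendation is flagged as actionable and, later, as acted upon:

```python
# Hypothetical sketch: percentage of actionable recommendations (indicator 59)
# that were acted upon, as assessed during a later internal assessment.
# The 'actionable' and 'acted_on' flags are assumed fields, not part of
# the indicator guidance.

def percent_actions_taken(recommendations):
    """recommendations: list of dicts with 'actionable' and 'acted_on' flags."""
    # The denominator is the count of actionable recommendations.
    actionable = [r for r in recommendations if r["actionable"]]
    if not actionable:
        return 0.0
    acted = sum(1 for r in actionable if r["acted_on"])
    return round(100.0 * acted / len(actionable), 1)

recs = [
    {"actionable": True,  "acted_on": True},
    {"actionable": True,  "acted_on": False},
    {"actionable": False, "acted_on": False},  # general statement, no next steps
    {"actionable": True,  "acted_on": True},
]
print(percent_actions_taken(recs))  # 66.7 (2 of 3 actionable recommendations)
```

How "actionable" and "acted upon" are coded from meeting minutes and reports remains a team decision, as the guidance above notes.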
73 Percentage of group members who have used knowledge from another group member in a given time period Measures the use of knowledge acquired through social connections This indicator measures the percentage of group members who have used knowledge acquired from another group member. Self-reported quantitative data; self-reported qualitative data. Typically, the denominator would be the number of group members reporting giving or receiving knowledge (the numerator from indicator 72), and the numerator would be the number of group members reporting having used that knowledge. Surveys; focus groups or other qualitative data for exploration and validation Quarterly, semiannually, or after specific activities The aim of this indicator is to understand the level of use of knowledge acquired through social networks. This indicator requires users to define the group of interest, such as an entire organization, a department or other functional team, a community of practice, or a Facebook group. Users will also need to define "knowledge" and "use of knowledge" in their particular context, as well as an appropriate time period, usually monthly or quarterly, for knowledge-sharing activities. 2017 Proportion Wednesday, December 13, 2017
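The numerator/denominator definition above can be sketched directly. This is an illustrative example with assumed survey fields: 'shared' marks members who reported giving or receiving knowledge (the numerator from indicator 72, used here as the denominator), and 'used' marks members who reported applying that knowledge.

```python
# Hypothetical sketch of indicator 73: percentage of knowledge-sharing
# group members who used knowledge from another member. Field names
# ('shared', 'used') are assumptions for illustration.

def percent_knowledge_used(members):
    """members: list of dicts with 'shared' and 'used' survey flags."""
    # Denominator: members who reported giving or receiving knowledge.
    sharers = [m for m in members if m["shared"]]
    if not sharers:
        return 0.0
    # Numerator: of those, members who reported using that knowledge.
    users = sum(1 for m in sharers if m["used"])
    return 100.0 * users / len(sharers)

survey = [
    {"shared": True,  "used": True},
    {"shared": True,  "used": True},
    {"shared": True,  "used": False},
    {"shared": False, "used": False},  # did not give or receive knowledge
    {"shared": True,  "used": True},
]
print(percent_knowledge_used(survey))  # 75.0 (3 of 4 sharers used knowledge)
```

As the guidance notes, "knowledge" and "use of knowledge" must be defined for the specific group before the flags can be coded consistently.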
75 Percentage of teams or subgroups that group members have used in a given time period Measures the use of diverse (or heterogeneous) social connections This indicator measures engagement in diverse subgroup memberships, such as a department, working group, committee, community of practice, or Facebook group, and focuses on the use or value of diverse connections. Self-reported quantitative data; self-reported qualitative data. Typically, the denominator would be a group member's total number of team or subgroup memberships (the numerator from indicator 74), and the numerator would be the number of those functional teams or subgroup memberships in which the group member is engaged. Surveys; focus groups or other qualitative data for exploration and validation Quarterly, semiannually, or after specific activities Connection diversity, or heterogeneity, supports the flow of knowledge through networks. The aim of this indicator is to understand how many different functional teams or other types of subgroups each group member is engaged in, and the average level of use of these connections by the network, organization, or group as a whole. This indicator requires users to define the group of interest, such as an entire organization, a department or other functional team, a community of practice, or a Facebook group, as well as any subgroups. For example, an entire organization may be defined as the group, with departments, committees, internal communities of practice, or roles as subgroups. Users will also need to define "knowledge" and "engagement" in their particular context, as well as an appropriate time period, usually monthly or quarterly, for knowledge-sharing activities. 2017 Proportion Wednesday, December 13, 2017
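Since this indicator is a per-member ratio that is then summarized across the whole group, a short sketch may help. This is illustrative only: it assumes each member reports a total number of subgroup memberships (the numerator from indicator 74) and how many of those they actively engaged in during the period, then averages the per-member percentages.

```python
# Hypothetical sketch of indicator 75: average percentage of subgroup
# memberships in which group members are engaged. The (memberships,
# engaged) pairs are assumed self-reported counts.

def avg_subgroup_engagement(members):
    """members: list of (memberships, engaged) counts, one pair per member."""
    # Per-member rate: engaged subgroups / total subgroup memberships.
    rates = [100.0 * engaged / memberships
             for memberships, engaged in members if memberships > 0]
    # Summarize for the group as a whole by averaging across members.
    return sum(rates) / len(rates) if rates else 0.0

# Example: three members engaged in 4 of 4, 1 of 2, and 3 of 4 subgroups.
reports = [(4, 4), (2, 1), (4, 3)]
print(avg_subgroup_engagement(reports))  # 75.0
```

Averaging per-member rates weights each member equally regardless of how many subgroups they belong to; teams could instead pool numerators and denominators if they prefer a membership-weighted figure.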