Call for Tenders: PISA 2024 Innovative Domain Assessment


Uploaded by emery on 2023-11-07






Presentation Transcript

1. Call for Tenders: PISA 2024 Innovative Domain Assessment
Learning in the Digital World
Webinar for interested bidders and Q&A

2. A few points to get started:
This presentation will be made available via the PISA website.
We will take some technical questions after the presentation, but we will most likely not be able to answer all questions during the seminar, so we will provide a Q&A recap to all participants after the seminar.
All information provided in this presentation and during the webinar is preliminary. The OECD reserves the right to make changes to the information provided. The final specifications for Core 4 will be released in the form of Terms of Reference via the eSourcing portal in February.

3. Objectives of the seminar
Inform interested bidders about development work of the PISA 2024 Innovative Domain assessment:
Give a clear idea of what the Contractor for Core 4 (Learning in the Digital World assessment development) will be expected to do, next steps for bidding and OECD procurement procedures.
Trigger discussion on plans and scope of work:
Clarify questions and identify critical issues.
Confirm with interested parties their intention to bid:
After this seminar we will ask participating organisations to confirm whether they would like to be invited to bid for the PISA 2024 Call for Tender: Core 4.

4. Contract award
Selection:
Contractor selection based on evaluation criteria;
Bidders should closely stick to what is requested in the Terms of Reference.
Contract information:
Contract duration: 6 years (2020-2026);
Contract type: fixed price lump sum.

5. Tentative procurement schedule: Core 4
Feb-2020: Call for Tender: Core 4 published
Feb-2020 to Apr-2020: Call for Tender opening period (6 weeks after launch)
Apr-May 2020: Evaluation of bids
May-Jun 2020: Negotiation of contract

6. A brief introduction to PISA
Content: Cognitive assessment of Science, Reading and Mathematics Literacy and Innovative Domain (PISA 2024: Learning in the Digital World) and Optional Assessment (PISA 2024: Foreign Language); questionnaires for students and school principals, optional questionnaires for parents and teachers, as well as optional components for the student questionnaire.
Population: 15-year-olds; PISA 2018: 79 countries/economies, PISA 2021: >85 countries/economies, PISA 2024: growing.
Test administered to 6 300 students, plus an additional sample of 1 650 students for the optional assessment, per country/economy (PISA 2015: 540 000 students).

7. Structure of PISA 2024 Call for Tenders

8. Context of this tender (Core 4)
Develop and design the instrument for the PISA 2024 innovative domain assessment (Learning in the Digital World).
Undertake analysis of data collected in cognitive laboratories, pilot studies, the Field Trial and the Main Study.
Deliver results to the OECD via a database.
Contract period: 2020-2026.

9. Description of assessment
Learning in the digital world defined: “the capacity to learn about complex phenomena and solve problems using digital resources, effectively regulating their learning processes with these resources”.
The assessment aims not only to develop measures of achievement, but also to gather information on how students regulate their learning experiences in digital environments.
The assessment will present students with complex learning and problem-solving tasks in authentic learning environments. These environments require students to apply strong self-regulatory skills and inquiry-based learning practices to search for and use information, create models for inquiry and design algorithmic solutions.

10. Competency model

11. Inquiry-based learning practices
Mastering these practices allows learners to be active users of digital tools. Measured through traditional response items (e.g. MCQ) and process data.
Information search and evaluation:
Processes related to searching for and identifying relevant information and evaluating the trustworthiness of information found in digital resources. Students who can ‘seek and evaluate information’ can:
Identify the information they need and search strategically in order to find it;
Find and reconcile information from different sources of information/data;
Distinguish relevant from irrelevant information within a source;
Evaluate the integrity and reliability of different information sources.
Model-based inquiry:
One way individuals can learn with computers is through model-based inquiry, which involves designing and conducting investigations, analysing data and learning by building, testing and revising models. Students who have mastered ‘model-based inquiry’ can:
Identify patterns based on observations and exploration;
Raise hypotheses that can explain the observed phenomenon;
Design investigations that allow them to collect informative data and evaluate hypotheses;
Analyse available data to extract correct relationships between variables;
Critique and revise hypotheses based on results;
Synthesise findings;
Represent a phenomenon using mathematical, linguistic and computational tools;
Use models to explain data, make predictions and answer questions;
Analyse, evaluate and improve on given models.

12. Inquiry-based learning practices
Mastering these practices allows learners to be active users of digital tools. Measured through traditional response items (e.g. MCQ) and process data.
Design algorithmic solutions:
An algorithm is a sequence of precise step-by-step instructions that represents one approach to solving a problem. Deciding which algorithm to use or design to solve a given problem requires a combination of thinking skills, including abstraction, generalisation and decomposition of a problem. Developing an algorithm also requires iterative and systematic debugging. Students who can design algorithmic solutions can:
Apply an algorithmic concept to a specific design task;
Decompose problems into parts that can be solved independently, using algorithmic control structures such as conditionals;
Use abstraction to develop, modify or implement an algorithm to accomplish a task;
Modify a given algorithm to adapt to changes in the task structure or goal;
Understand what a given program does after having received training on the concepts included in the program;
Debug a given program;
Combine, recycle and integrate (parts of) algorithms to design solutions for new problems.
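To illustrate the kind of decomposition and conditional control these competencies describe, here is a minimal Karel-style sketch in the spirit of the “Programming Karel” prototype shown later. The grid world and function names below are invented for illustration; they are not the actual PISA task.

```javascript
// Minimal Karel-style world: a robot on a one-dimensional corridor.
// Hypothetical sketch; not the actual PISA "Programming Karel" task.
function makeWorld(length, start = 0) {
  return { length, position: start };
}

// One precise instruction: advance a single cell unless blocked by the wall.
function moveForward(world) {
  if (world.position < world.length - 1) {
    world.position += 1;
    return true; // move succeeded
  }
  return false;  // blocked by the wall
}

// Decomposed solution: repeat a simple sub-step under a conditional
// until the goal state (the wall) is reached.
function walkToWall(world) {
  let steps = 0;
  while (moveForward(world)) {
    steps += 1;
  }
  return steps;
}

console.log(walkToWall(makeWorld(5))); // → 4 (from cell 0 to cell 4)
```

The point of the sketch is the competency list above: the problem is decomposed into a reusable sub-step (`moveForward`), and a control structure (the `while` conditional) determines when the goal is reached.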

13. Self-regulation
Self-regulating one’s own learning can ensure more successful use of digital tools and, in turn, increase one’s capacity to solve problems and learn in complex environments. Measured through process data and questions.
Performance monitoring:
Students who actively monitor their performance when presented with a problem assess gaps in their existing understanding and skills, seek feedback on performance and then evaluate the extent to which they met their learning goal(s) in order to improve next time. Performance monitoring is an iterative process that should occur throughout the learning experience. Students who successfully engage in performance monitoring can:
Correctly assess their ability to complete the task successfully;
Actively test their ideas and seek feedback on their performance;
Correctly evaluate the extent to which they have successfully achieved a task;
Identify the need for help or assistance when performance is not satisfactory.
Execution and adaptation of strategies:
Students who adopt a strategic approach can effectively navigate through complex learning tasks. Good strategies all involve a coherent sequence of actions that build upon each other in a logical manner. Learners must also be able to adapt their chosen strategies as needed. Students who can execute and adapt strategies can:
Adopt sequences of actions that are coherent (each action builds on previous actions);
Adopt sequences of actions that are productive (positively related to performance);
Make effective use of their time by prioritising actions;
Adapt or attempt a new strategy after receiving information that the current strategy is not successful.

14. Self-regulation
Self-regulating one’s own learning can ensure more successful use of digital tools and, in turn, increase one’s capacity to solve problems and learn in complex environments. Measured through process data and questions.
Regulation of effort and engagement:
To learn effectively in digital environments, students need to actively regulate their affective states, effort and engagement. Students must be able to stay engaged and committed to working through complex learning tasks, especially when experiencing difficulty. Students who can effectively regulate their effort and engagement can:
Invest effort in learning by exploring the learning space;
Continue to work on the task until its successful completion or until time runs out;
Strive to find a solution even after negative feedback, managing frustration;
Progress following the rules of the game rather than gaming the system.
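Since several of these self-regulation constructs are to be measured through process data, a toy sketch can show what deriving a persistence indicator from an event log might look like. The event names and fields below are invented for illustration; they are not the PISA platform's actual log schema.

```javascript
// Toy event log: each entry records an action type and a timestamp (seconds).
// Event names and fields are invented for illustration.
const log = [
  { t: 0,  type: "attempt", correct: false },
  { t: 12, type: "feedback", valence: "negative" },
  { t: 20, type: "attempt", correct: false },
  { t: 35, type: "feedback", valence: "negative" },
  { t: 50, type: "attempt", correct: true },
];

// Persistence indicator: share of negative-feedback events that are
// followed by at least one further attempt (rather than giving up).
function persistenceAfterNegativeFeedback(events) {
  let negatives = 0;
  let followedByAttempt = 0;
  events.forEach((e, i) => {
    if (e.type === "feedback" && e.valence === "negative") {
      negatives += 1;
      if (events.slice(i + 1).some((later) => later.type === "attempt")) {
        followedByAttempt += 1;
      }
    }
  });
  return negatives === 0 ? null : followedByAttempt / negatives;
}

console.log(persistenceAfterNegativeFeedback(log)); // → 1 (kept trying after both)
```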

15. Task prototype examples

16. Example task for model-based inquiry. “The Memory Lab”: Task interface

17. Example task for model-based inquiry. “The Memory Lab”: Two simulations

18. Example task for model-based inquiry. “The Memory Lab”: Qualitative matrix

19. Example task for model-based inquiry. “The Memory Lab”: Hypothesis builder

20. Example task for model-based inquiry. “The Memory Lab”: Check findings

21. Example task for information search & evaluation. “Zoogle search”: Task interface

22. Example task for design algorithmic solutions. “Programming Karel”: Task interface

23. Example task for design algorithmic solutions. “Programming Karel”: Tutorial

24. Example task for design algorithmic solutions. “Programming Karel”: Tutorial (continued)

25. Example task for design algorithmic solutions. “Programming Karel”: Learning activities

26. Example task for design algorithmic solutions. “Programming Karel”: Evaluation of prior knowledge

27. PISA Platform Capabilities
PISA 2021 platform based on the Electron framework.
HTML5 and EcmaScript 6 are available to use (+ canvas support).
Use current web technology files (HTML, CSS and JavaScript).
Automatic scoring rules embedded in items.
Flash not supported in current platform.
Built-in capacity to collect log information.
Contractor may need to develop methodologies for more extensive use of log data (scoring and intelligent feedback).
For translation reasons, units must be developed following certain guidelines (e.g. avoid text within images).
System supports right-to-left languages (Arabic and Hebrew).
Core 1 to provide templates and training for unit design.
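As a rough sketch of what “automatic scoring rules embedded in items” can mean in plain EcmaScript 6: the item definition carries its own scoring function, so the platform can score responses without external services. The item structure below is an assumption for illustration, not the actual PISA platform API.

```javascript
// Illustrative item definition with an embedded automatic scoring rule.
// This structure is an assumption, not the actual PISA platform API.
const item = {
  id: "LDW-demo-01",
  prompt: "Which variable explains the observed change?",
  options: ["A", "B", "C", "D"],
  // The scoring rule travels with the item: full credit for "B",
  // partial credit for the plausible distractor "C".
  score(response) {
    if (response === "B") return 1;
    if (response === "C") return 0.5;
    return 0;
  },
};

console.log(item.score("B")); // → 1
console.log(item.score("A")); // → 0
```

Packaging the rule with the item is what allows offline delivery and immediate feedback without a round trip to a scoring server.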

28. Core 1 and Core 4 Collaboration
Core 1: Survey design, development and implementation
Develop and provide Core 4 with unit structure template and guidelines for authoring process;
Support Core 4 engineers/programmers during authoring process;
Provide advice and support to Core 4 for developing, validating and testing interactive components of the tasks and scoring methods of all items;
Share initial item pool with PISA member countries for review and provide Core 4 with their feedback;
Responsible for back-end operations of authoring process;
Provide Core 4 with all data collected during Field Trial and Main Study;
Send national data files (including LDW) to participating countries;
Day-to-day management of project and coordination of PISA International Contractors;
Overall coordination and smooth running of Contractors' schedules.
Core 4: Instrument development for the Learning in the Digital World assessment
Work with Core 1 to test/validate items in PISA computer platform (one unit by November 2020 and all units before Field Trial and Main Study);
Work with Core 1 to ensure collection and use of process data can be supported and delivered by PISA platform;
Work with Core 1 to package initial item pool in suitable format and deliver to Core 1 for country review;
Attend monthly Contractor Calls between International Contractors and OECD Secretariat;
Project Lead to liaise with International Survey Director (Core 1) on work related to Core 4;
Cooperate with timelines agreed on by Core 1 and international contractors.

29. Task 1: Conceptual framework review
Ensure the Learning in the Digital World assessment closely aligns with the conceptual framework.
Process:
The conceptual framework will inform the design and development of assessment instruments, and vice versa, through a collaborative process between the chosen Contractor and the OECD Secretariat. Development of the framework is managed by the OECD Secretariat, in collaboration with the Expert Group, and with feedback/input from participating countries.
Framework drafts:
Second draft of the framework concepts upon award of contract;
First full draft of the framework in September 2020;
Second full draft of the framework in March 2021.

30. Task 1: Conceptual framework review
Closely align the Learning in the Digital World assessment and the conceptual framework.
Deliverables and timeline:
Review/provide feedback on draft versions of the conceptual framework to ensure synergy between framework and instrument development (Ongoing)

31. Task 2: Instrument development
Develop test items and associated scoring materials for the innovative domain assessment, in a suitable format for integration in the PISA platform, and submit the instruments for country review.
Process:
Design and develop test items and scoring procedures that can capture information about students':
Capacity to engage in relevant thinking practices identified in the framework (this needs to result in a comparable scale);
Self-regulation processes when working through complex, open-ended tasks (experimental work, conducted with the Expert Group).
Ensure test items are cross-culturally, cross-linguistically and cross-nationally valid.
Share the initial pool of test items with participating countries for their review.
Finalise the item pool for the Field Trial and Main Study by reviewing data results.

32. Task 2: Instrument development
Develop test items and associated scoring materials for the innovative domain assessment, in a suitable format for integration in the PISA platform, and submit the instruments for country review.
Deliverables and timeline:
Develop one full test unit (storyboard format is OK) and deliver to OECD (Sept. 2020)
Develop all test units and package/deliver to Core 1 for country review (Sept. 2021)
Perform content alignment study with all test units and provide report of results to OECD (Nov. 2021)
Using country review feedback, select and finalise initial item pool for Field Trial with Expert Group and OECD; deliver to Core 1 (Dec. 2021)
Review Field Trial results and propose Main Study item selection to PGB (Oct. 2023)

33. Task 3: Authoring and platform compatibility
The front-end authoring process and compatibility tests to ensure the assessment can be fully integrated into the PISA platform.
Process:
Complete the front-end authoring work for the 2024 PISA innovative domain assessment:
Initial design of the content and structure of units;
Programming of the graphic and textual content displayed on screen, and of the available interactions, using HTML, CSS and JavaScript;
Programming of automated feedback and scoring rules for each item.
Test the integration of items within the PISA platform, with guidelines and support from Core 1, throughout the development period.
Ensure contents of units are protected.
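The “automated feedback” step in this authoring work can be sketched, under the assumption of a simple ordered rule table (invented here for illustration, not the actual platform mechanism), as:

```javascript
// Illustrative automated-feedback rules for one item; an assumption
// for illustration, not the actual PISA platform mechanism.
const feedbackRules = [
  { when: (r) => r.answer === "B", message: "Correct: B matches the data." },
  { when: (r) => r.attempts >= 3, message: "Hint: compare the two simulations again." },
  { when: () => true, message: "Not quite. Re-examine the evidence." },
];

// Return the message of the first matching rule (rules are checked in order,
// so the catch-all rule at the end always provides a fallback).
function giveFeedback(response) {
  return feedbackRules.find((rule) => rule.when(response)).message;
}

console.log(giveFeedback({ answer: "B", attempts: 1 })); // correct-answer message
console.log(giveFeedback({ answer: "A", attempts: 3 })); // hint message
```

An ordered rule table like this keeps the feedback logic declarative, which matters when each item's rules must also survive translation and country review unchanged.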

34. Task 3: Authoring and platform compatibility
The front-end authoring process and compatibility tests to ensure the assessment can be fully integrated into the PISA platform.
Deliverables and timeline:
Test integration of first test unit into PISA platform with Core 1 (Nov. 2020)
Test that all units are fully compatible and functional in PISA platform (Ongoing)
Undertake any corrections needed in units and conduct further tests with Core 1 in preparation for Main Study administration in the chosen PISA platform (Jan. 2024)
Provide OECD with files for all units that enable the reproduction of units on the OECD webpage (March 2024)

35. Task 4: Cognitive laboratories and pilot studies
To inform item development, small-scale validation studies will be undertaken.
Process:
Conduct two rounds of cognitive laboratories to:
Assess students' capacities to work in the open environment;
Gauge the extent to which assessment units address the intended cognitive and self-regulatory processes.
Conduct two pilot studies before the finalisation of the initial item pool for the Field Trial. Approximately 200 students from each country (1 country in the first round and 3 countries in the second round) will participate. The Contractor is responsible for 1 country.
Analyse the data and provide a report to the OECD.

36. Task 4: Cognitive laboratories and pilot studies
To inform item development, small-scale validation studies will be undertaken.
Deliverables and timeline:
Conduct first round of cognitive laboratories in Contractor's country of choice (Aug.-Oct. 2020)
Administer first round of pilot testing (4 test units) in one country (Dec. 2020)
Produce a cognitive laboratory protocol document and conduct training sessions to support administrators of cognitive laboratories (May 2021)
Conduct second round of cognitive laboratories in three countries, Contractor responsible for one (May-Aug. 2021)
Administer second round of pilot testing (all units) in one country and support administration in two other countries (July-Sept. 2021)
Analyse transcripts from cognitive laboratories and provide OECD with a report (After both rounds)
Analyse results of pilot studies and provide OECD with a report (After both rounds)

37. Task 5: Test design plan
Development of the test design (Field Trial and Main Study).
Process:
The PISA 2024 Main Study assessment has a duration of two hours, broken down into hour-long sequences. Some students will take the innovative domain test for two hours, others for only one hour.
For the Field Trial, produce a minimum of six 30-minute clusters of assessment material (i.e. 3 hours of assessment material), given that only units included in the Field Trial can be implemented in the Main Study.
The Main Study will include four clusters of assessment material for the Learning in the Digital World assessment (i.e. 2 hours of assessment material).
Country review may result in units needing to be modified, redesigned or dismissed altogether. Consider producing a larger number of units for country review than is planned for administration in the Field Trial.
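The cluster arithmetic above (six 30-minute Field Trial clusters giving 3 hours of material, assembled into hour-long sequences) can be illustrated with a toy rotation sketch. The form-construction scheme below is an assumption for illustration only, not the actual PISA 2024 test design.

```javascript
// Toy balanced rotation: pair clusters into two-cluster (one-hour) forms
// so that every cluster appears in the same number of forms.
// Illustrative only; not the actual PISA 2024 test design.
function rotationForms(clusters) {
  const forms = [];
  for (let i = 0; i < clusters.length; i++) {
    // Pair each cluster with the next one, wrapping around at the end.
    const j = (i + 1) % clusters.length;
    forms.push([clusters[i], clusters[j]]);
  }
  return forms;
}

const fieldTrialClusters = ["C1", "C2", "C3", "C4", "C5", "C6"];
const forms = rotationForms(fieldTrialClusters);
console.log(forms.length); // → 6 one-hour forms
// Each cluster appears in exactly 2 forms; 6 clusters × 30 min = 3 hours of material.
```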

38. Task 5: Test design plan
Development of the test design (Field Trial and Main Study).
Deliverables and timeline:
Submit initial test design plan for Field Trial to OECD for approval; seek input from Core 1 on test design (Jan. 2021)
Produce test design plan for Main Study in collaboration with Core 1 and submit to OECD (Oct. 2023)

39. Task 6: Item scoring procedures
Development of rigorous, internationally comparable scoring procedures for the assessment.
Process:
Scoring procedures will include methods for parsing log data and transforming them into observable indicators.
The Contractor's responsibilities in this task depend on whether the assessment uses fully automated scoring or a combination of automated and human scoring:
Fully automated scoring: produce and revise scoring methods based on expert inputs and data analyses.
Combination of automated and human scoring: develop scoring guides and training material (revised based on country input and data analyses), prepare and conduct training workshops, and assist national coders in scoring test items.
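A toy sketch of “parsing log data and transforming them into observable indicators” might look like the following; the event fields and indicator names are invented for illustration, not a specification of the actual scoring methodology.

```javascript
// Transform a raw event stream into per-student observable indicators.
// Event fields and indicator names are invented for illustration.
function toIndicators(events) {
  const actions = events.filter((e) => e.type === "action");
  const timeOnTask = events.length
    ? events[events.length - 1].t - events[0].t
    : 0;
  return {
    nActions: actions.length,
    timeOnTask,                       // seconds from first to last event
    usedHelp: events.some((e) => e.type === "help_request"),
  };
}

const events = [
  { t: 0,   type: "action" },
  { t: 40,  type: "help_request" },
  { t: 90,  type: "action" },
  { t: 120, type: "action" },
];
console.log(toIndicators(events)); // → { nActions: 3, timeOnTask: 120, usedHelp: true }
```

Indicators of this kind are the bridge between raw log files and the scales and indices described under Task 7.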

40. Task 6: Item scoring procedures
Development of rigorous, internationally comparable scoring procedures for the assessment.
Deliverables and timeline:
Develop, validate and test scoring methods for any open-ended items, including automatic scoring rules (Ongoing)
Submit Field Trial coder training materials, if human scoring is included (Nov. 2022)
Carry out coder training workshops for participating countries' national centres during International Coder Training meetings organised by Core 1, if human scoring is included (Field Trial coder training: Jan. 2023)
Submit Main Study coder training materials, if human scoring is included (Nov. 2023)
Respond to queries from countries during the scoring process and revise scoring guides and training materials based on country input and data analysis (After Field Trial)

41. Task 7: Data analysis
Data analysis of the data collected in the Field Trial and Main Study for the PISA 2024 Learning in the Digital World assessment (both from the test and the questionnaire items).
Process:
Conduct analyses of the item scores and process data from the Field Trial and Main Study. Goals of data analysis and reporting will be informed by the framework.
Prepare an analysis plan that includes:
What scales and indices will be built to generate scores;
Plans for parsing and analysing log-file data;
Analytical and psychometric techniques.
Develop and submit a Field Trial analysis report and Main Study item selection proposal to the PISA Governing Board.
Produce and provide the OECD an international database, which will include all derived variables, scales and indices, as well as a complete set of log-file data for secondary analysis.

42. Task 7: Data analysis
Performing data analysis of process data from the Field Trial and Main Study, developing indicators and scores.
Deliverables and timeline:
Prepare and submit a data analysis plan with the OECD to the PGB (Mar. 2021)
Analyse Field Trial data according to the analysis plan and provide OECD a report developed with the Expert Group (Oct. 2023)
Provide OECD with Field Trial dataset (Oct. 2023)
Conduct analyses of the Learning in the Digital World Main Study, in accordance with the analysis plan (Oct. 2024 to Oct. 2025)
Develop achievement level descriptors and conduct standard setting study; deliver results to OECD (Dec. 2024)
Deliver national data files, containing scales and indicators, to the OECD and Core 1 (TBD with Core 1)
Deliver fully documented database to OECD in three batches (April, June & Aug. 2025)
Contribute to the sections or chapters relevant to the methodology of their work in the PISA 2024 Technical Report (Mar. 2025)
Provide statistical and technical support to OECD, design and provide basic descriptive tables, and review the international report (2026)

43. Task 8: Project management
Project management, coordination and liaison with the OECD and other PISA International Contractors; provision of technical advice and progress reports.
Process:
Collaborate with Core 1 to guarantee the successful development and integration of the innovative domain assessment.
Respect timelines agreed upon by all International Contractors and Core 1.
Nominate a Project Lead who will coordinate the assessment development work and liaise with the International Survey Director (Core 1), other International Contractors and the OECD Secretariat.
Optimal team:
Project Lead;
Test developer with knowledge of learning sciences and embedded assessment;
Programmer (IT);
Psychometrician;
Someone with experience training human coders.
Participate in 7 face-to-face meetings during the contract period 2020-2026.
Support the translation process.

44. Task 8: Project management
Project management, coordination and liaison with the OECD and other PISA International Contractors; provision of technical advice and progress reports.
Deliverables and timeline:
Submit monthly progress reports (in agreed-upon format) to Core 1 and hold bi-monthly calls with OECD (Ongoing)
Propose, with Core 1, a management plan detailing integration of the innovative domain assessment, with timeline (Jan. 2021)
Contribute to the consolidated timeline for PISA 2024, to be drafted by Core 1 (Jun. 2021)
Collaborate with Core 1 throughout the development process to test and validate that items can be supported and delivered by the chosen PISA platform (Ongoing)
Liaise with the linguistic verification team and National Centres to ensure proposed translations do not alter the meaning or purpose of items (Ongoing)
One team member to attend 5 Expert Group meetings and 2 contractor meetings (2020-2026)

45. Student and school questionnaire (optional)
Deliverables and timeline:
Develop 10-minute module, with questionnaire scales defined in framework (March 2021)
Carry out small-scale validation studies, e.g. cognitive laboratories (March 2021)

46. Technical Bid Proposal
70-80 pages maximum (not including appendices);
Organised by tasks (8 tasks total);
Structured bids with specific questions for each task.
Main elements of technical bid:
Initial ideas on type of tasks and measurement tools that will be used;
1-2 examples of prototype tasks;
Integration strategy into PISA platform;
Strategies for data analysis, indicator development and scoring;
Project management plan and team composition.
Financial Bid Proposal
Fixed-price contract;
Detailed budget breakdown by task, including respective expenditures on each task;
Financial template (Excel) provided;
In EUR;
Contract/work period: May 2020 to 2026.

47. Evaluation Criteria
Overall project approach: 40%
Approach and management of data analysis: 20%
Project management: 15%
Financial evaluation of the allocation of the fixed budget: 25%

48. Q&A Session
1. Will the stringent linguistic equivalence standards that apply for the literacy domains also be applicable for the innovative domain?
PISA strives to achieve the same standards for the innovative domains that are applied to the main domains (e.g. Science, Reading and Mathematics). However, there is an understanding among the participating countries that the innovative domain is more experimental than the main domains, which can affect the possibility of achieving the same standards. The PISA innovative domain aims to achieve linguistic equivalence, and the same processes of linguistic verification are applied.
2. Will there be a provision for a preliminary screening of translatability and cultural relevance of the Learning in the Digital World instruments?
Yes, countries will have the opportunity to screen for cultural relevance at several points during the development process: countries will be asked to review the units that are developed for the pilot studies in 2020 and 2021, and will undertake a final review of all the Field Trial instruments. Translatability checks will be implemented by the contractor responsible for translations before the Field Trial, following standard processes and timelines managed by the Core 1 Contractor.
3. Where can we find the financial conditions and regulations for this call (e.g. are there prescribed rates and overheads)?
All of the information that we can provide on this will be detailed in the Call for Tenders.
4. Who is in the lead of creating the tasks, the Expert Group or the consortium?
The Contractor, or consortium of contractors (in the case of sub-contracting agreements), is responsible for creating the assessment tasks. The Expert Group's role is to define the constructs for which the Core 4 Contractor will build relevant tasks. This will be a collaborative project between the Expert Group and the Contractor, or consortium of contractors.
The Expert Group will convene bi-annually, and the contractor or consortium will be invited to present their work. The experts will provide the Contractor with feedback and orientation on the instrument development. It is important that the Core 4 Contractor integrates the experts' feedback during successive iterations of the assessment tasks.

49. Q&A Session
5. Is the Contractor also responsible for analysing the actual PISA 2024 data (i.e. the >600 000 students)?
The Core 4 Contractor is responsible for completing the data analysis related to the PISA 2024 Learning in the Digital World assessment for the Field Trial and Main Study. The data analysis includes:
Developing the international scale;
Developing, in collaboration with the Expert Group, achievement-level descriptors for the scale;
Deriving indicators on the regulation construct, as defined in the analysis plan; and
Producing the international database.
It is important to note that the sample of students taking the innovative domain is much smaller than 600 000. In past cycles, approximately 30% of PISA students in each participating country took the innovative domain, and some countries either could not participate (countries that take the paper-based version of PISA) or decided not to participate.
6. Why 2026 as an end date when the test is in 2024?
The contract period will span from 2020 to 2026 because the Core 4 Contractor is expected to receive the data collected in the Main Study from the Core 1 Contractor in 2025. The Core 4 Contractor will then analyse the Main Study data related to the Learning in the Digital World assessment and create the international database, all of which will be completed and delivered to the OECD by the end of 2025. The OECD Secretariat is responsible for drafting the PISA 2024 international report, which will most likely occur in the first half of 2026. To support the preparation of this report, the Core 4 Contractor may need to respond to queries related to technical aspects.

50. Q&A Session
7. Could you speak to the computer/hardware specs and requirements, as well as the bandwidth specifications, to which vendors will be building?
The Call for Tender specifies a set of minimum requirements that we expect the chosen PISA platform to respect. These minimum requirements are outlined in the webinar presentation on slide 28. The PISA 2024 assessment will be delivered both online and offline. There will be multiple methods to deliver the tests offline, one of which will be the traditional method of using USB keys. In terms of bandwidth, the specific information on the online and offline delivery will be defined by the chosen Core 1 Contractor. The Core 1 Contractor will develop and provide to the Core 4 Contractor detailed specifications on platform capabilities and delivery modes, and training to facilitate authoring that is consistent with requirements for integration and translation.
8. For the cognitive laboratories, is Core 4 in charge of the field work?
Yes, the Core 4 Contractor will be responsible for the field work in one country of its choice for both rounds. The first round of cognitive laboratories will occur at some point between August and October 2020 in one country of the Contractor's choice. The second round will be conducted at some point between May and August 2021 in three countries; the Core 4 Contractor will only be responsible for one of those countries (it is possible for the Core 4 Contractor to hold the cognitive laboratories in the same country as the first round). The OECD Secretariat can provide support in this process if necessary, for example by seeking interest for implementation support from national teams. Bidders should provide information in the Technical Bid proposal on whether their organisation has the capacity to undertake this kind of field work.
9. Will the presentation and Q&A session be provided after the webinar?
Yes, the presentation is available on the homepage of the PISA website.

51. For any further questions or to be invited to bid for Core 4, please email us at the addresses below:
PISA Innovative Domains team:
Mario Piacentini: Mario.piacentini@oecd.org
Natalie Foster: Natalie.foster@oecd.org
Megan Welsh: Megan.welsh@oecd.org
Emma Linsenmayer: Emma.linsenmayer@oecd.org
Procurement:
Giorgio Cerniglia: Giorgio.cerniglia@oecd.org