Application deadline: Friday 3rd July, 2020.
Sydney Health Ethics is seeking applicants for a PhD research scholarship, as part of an ARC-funded project entitled 'TARGeT – Theories of Autonomy in Reproductive Genetic Technologies'. This scholarship aims to financially support a PhD student researching theoretical bioethics, applied ethics, or other sub-fields of philosophy, as they relate to reproductive autonomy.
Those who are interested are encouraged to read the information provided at the links below, and to contact Prof Ainsley Newson (ainsley.newson at Sydney.edu.au) or Dr. Kathryn MacKay (kathryn.mackay at Sydney.edu.au).
You can find information about the project and this PhD opportunity at the TARGeT blog: https://targetautonomy.blogspot.com
For more information and to apply:
If a candidate wishes to discuss an extension to the application deadline, we encourage them to contact us.
Applications will be considered until June 15, 2020, or until the position is filled.
Artificial Intelligence is pervading every aspect of our lives: it informs decisions in healthcare, transportation, finance, education, and government, to mention just a few areas. With it, explainable AI (XAI) for high-stakes decision making is at the centre of current debates, receiving much (and deserved) attention from industry and academia alike. AI is essentially a sociotechnical system, in which a decision maker interacts with various sources of information and decision-support tools, a process whose quality should be assessed in terms of the final, aggregated outcome (the quality of the decision) rather than the quality of the decision-support tool in isolation (e.g., its predictive accuracy and standalone precision). It is therefore important to develop tools that explain their predictions in meaningful terms, a property rarely matched by AI systems on the market today.
The explanation problem for a decision-support system can be understood as a trade-off between what algorithms can safely ignore and what meaningful information must be included to make an informed decision. The problem of XAI is thus intertwined with epistemic and normative problems, such as trustworthiness (what sort of knowledge we have and what can safely be ignored), comprehensibility (the human meaningfulness of explanations), and accountability (humans retaining ultimate responsibility for the decision). In this context, several epistemological, ethical, and legal questions emerge: What is the logic of a successful XAI? How can we ensure reliability and trust? Which ethical concerns emerge in the context of (un)successful explanations? Does current XAI comply with regulations? The successful candidate will work at the intersection of epistemological and ethical issues, with a strong emphasis on the normative aspects of explainable AI.
Different ethical frameworks and principles will be studied to understand the implications of XAI and the roles of different stakeholders in high-stakes decision making (developers, users, individuals affected by a system's actions, legal experts, etc.). Case studies will also be a focus of this project, as they will engage relevant stakeholders to capture adherence to social, legal, and ethical norms, and, ultimately, human responsibility.
This PhD project will be developed within the EU Horizon2020 project SoBigData++. SoBigData++ strives to deliver a distributed, Pan-European, multi-disciplinary research infrastructure for big social data analytics, coupled with the consolidation of a cross-disciplinary European research community, aimed at using social mining and big data to understand the complexity of our contemporary, globally interconnected society. SoBigData++ builds on these ambitious tasks thanks to SoBigData, the predecessor project that began this work in 2015. As it becomes an advanced community, SoBigData++ will strengthen its tools and services to empower researchers and innovators through a platform for the design and execution of large-scale social mining experiments. It will be open to users with diverse backgrounds, accessible on the project cloud (aligned with EOSC), and will also exploit supercomputing facilities. Pushing the FAIR principles further, SoBigData++ will render social mining experiments more easily designed, adjusted and repeated by domain experts who are not data scientists. SoBigData++ will move forward from a starting community of pioneers to a wide and diverse scientific movement, capable of empowering the next generation of responsible social data scientists, engaged in the grand societal challenges laid out in its exploratories: Societal Debates and Online Misinformation, Sustainable Cities for Citizens, Demography, Economics & Finance 2.0, Migration Studies, Sport Data Science, Social Impact of Artificial Intelligence and Explainable Machine Learning. SoBigData++ will advance from an awareness of ethical and legal challenges to concrete tools that operationalise ethics with value-sensitive design, incorporating values and norms for privacy protection, fairness, transparency and pluralism.
SoBigData++ will deliver an accelerator of data-driven innovation that facilitates collaboration with industry on joint pilot projects, and will consolidate a research infrastructure ready for the ESFRI Roadmap and sustained by a SoBigData Association.
• Master’s degree or equivalent in philosophy or a similar discipline.
• Candidates with interests in analytical philosophy (e.g. ethics of algorithms, ethics of AI, ethics of technology, philosophy of action) and a strong affinity with (or a degree in) philosophy of science, epistemology, philosophy of technology, or philosophy of engineering and computer science are strongly encouraged to apply.
• Strong interest in interdisciplinary research, principally with computer scientists.
• Excellent command of written and spoken English.
• Excellent communication skills and an interest in translating research ideas and findings for the benefit of non-academic stakeholders (e.g. managers and policymakers).
• The ability to work both independently and as part of a team.
CONDITIONS OF EMPLOYMENT
Fixed-term contract: 4 years.
TU Delft offers PhD-candidates a 4-year contract, with an official go/no go progress assessment after one year. Salary and benefits are in accordance with the Collective Labour Agreement for Dutch Universities, increasing from € 2325 per month in the first year to € 2972 in the fourth year. As a PhD candidate you will be enrolled in the TU Delft Graduate School. The TU Delft Graduate School provides an inspiring research environment with an excellent team of supervisors, academic staff and a mentor. The Doctoral Education Programme is aimed at developing your transferable, discipline-related and research skills.
The TU Delft offers a customisable compensation package, discounts on health insurance and sport memberships, and a monthly work costs contribution. Flexible work schedules can be arranged. For international applicants we offer the Coming to Delft Service and Partner Career Advice to assist you with your relocation.
Technische Universiteit Delft
Delft University of Technology (TU Delft) is a multifaceted institution offering education and carrying out research in the technical sciences at an internationally recognised level. Education, research and design are strongly oriented towards applicability. TU Delft develops technologies for future generations, focusing on sustainability, safety and economic vitality. At TU Delft you will work in an environment where technical sciences and society converge. TU Delft comprises eight faculties, unique laboratories, research institutes and schools.
Faculty Technology, Policy and Management
The Faculty of Technology, Policy and Management (TPM) makes an important contribution to solving the complex technical and social challenges that we face as a society. Challenges such as energy, climate, mobility, IT, water and cyber security. This requires a multidisciplinary approach that goes beyond technology. Our education and research are therefore at the intersection of technology, society and management. We combine insights from the engineering sciences with those from the humanities and social sciences. We develop robust models and designs, are internationally oriented, and have extensive networks in science and practice.
For information about this vacancy, you can contact Juan M. Durán, email: j.m.duran[at]tudelft.nl.
For information about the selection procedure, please contact Mrs. Anita van Vianen, HR Advisor, email: vacature-tbm[at]tudelft.nl.
If you are interested in this position, please include in your application: (a) a CV, (b) a motivation letter including the names of two references, and (c) a list of publications, all in a single PDF entitled “TPM20.038_YourLastname.pdf”. Send your application to vacature-tbm[at]tudelft.nl. Applications will be considered until June 15, or until the position is filled. Due to the COVID-19 measures, a remote start is possible.
Friday May 15th, 2020, 12.00 – 12.40 BST.
From the Nuffield Council on Bioethics.
The fourth in our current series of COVID-19-related webinars will explore the importance of transparency, and of public involvement and deliberation in research and policy where public interests and values are at stake. We ask whether the Prime Minister’s commitment to ‘maximum transparency’ is at least a step in the right direction. This meeting will be of interest to those in policy roles as well as interested academics, public and third sector groups, industry and members of the public.
A recording and summary of the webinar will be available on our website shortly afterwards.
Further details and registration here.
The Nuffield Council on Bioethics has created an online resource responding to COVID-19, which includes policy briefings and guidance.
Webinars are also available to watch any time, discussing topics that include:
COVID-19 and policy making: the role of public engagement and deliberation
Tackling the challenges of conducting COVID-19 research ethically in lower income settings
Beyond the exit strategy: ethical uses of data-driven technology in the fight against COVID-19
Ethics in the research response to COVID-19
Around the world, there are troubling examples of minority groups coming under extra pressure during the current pandemic. As well as the demonisation of some groups as supposed vectors of disease, we have seen governments using emergency powers in ways that may disadvantage minorities and advance populist and nationalist agendas, and the deepening of pre-existing division and discrimination. Join us as Blavatnik School researchers discuss the protection of minority rights in the current situation, considering the political, legal and ethical frameworks.
This event is part of the Alfred Landecker Programme.
Please note: This event will take place online via Zoom and be streamed live.
- Jonathan Wolff, Alfred Landecker Professor of Values and Public Policy
- Dapo Akande, Professor of International Public Law
- Maya Tudor, Associate Professor of Government and Public Policy
- Federica D’Alessandra, Executive Director, Programme on International Peace and Security, Oxford Institute for Ethics, Law and Armed Conflict
Thursday May 7th, 2020, 4.00 – 5.30 pm.
New St Cross Special Ethics Seminar, jointly organised by the Oxford Uehiro Centre and the Wellcome Centre for Ethics and Humanities (https://www.weh.ox.ac.uk/).
Speaker: Professor Arthur Schafer, Centre for Professional and Applied Ethics, University of Manitoba.
Abstract: In June of 2016 the Canadian Parliament passed legislation (Bill C-14) legalizing MAiD: medical assistance in dying. Subject to various restrictions, both mercy killing and medically assisted suicide are now legal in Canada. The contours of the Canadian euthanasia debate will be described, with special focus on the ethical issues that remain most controversial. Two salient Canadian Supreme Court decisions will be analysed: Rodriguez (1993) and Carter (2015), as well as more recent constitutional challenges. The presentation will conclude by outlining the further legal changes that are likely to (or that should) occur in the reasonably near future.
Register in advance for this webinar: https://zoom.us/webinar/register/WN_Z1wEPMJHT36-omb9-HEC4Q – after registering, you will receive a confirmation email containing information about joining the webinar.
April 29th, 2020, 7.00 pm London time.
Essex Explores – questions, insights and answers wherever you are.
Professor Wayne Martin
Director, The Essex Autonomy Project
School of Philosophy and Art History
The coronavirus pandemic has placed tremendous pressure on emergency services, and there has been a much-publicised shortage of critical life-saving medical resources. Frontline medical personnel have therefore had to struggle not only with the clinical challenges of care, but with fraught ethical dilemmas in situations where demand swamps supply. Hospital ethics committees and public health bodies around the world have had to develop policies and procedures to allocate scarce resources. How, if at all, can philosophical ethics (and philosophers who work in the academic study of ethical principles) help to navigate the challenges of triage under conditions of pandemic?
Further details and registration here.
In these unprecedented times we need to rapidly learn about the emerging evidence on COVID-19. This evidence includes not only what is being gathered, analysed and reported by researchers but also evidence relating to the approaches different countries and inter-governmental organisations have taken to respond to the pandemic.
Webinars will be targeted to members of the academic community in the UK and beyond, the public health community in Scotland and further afield, and key decision-makers from a range of sectors.
These webinars are hosted via Zoom and live streamed to YouTube. Videos are available after the events. See the links below to register for future events (when available) and to view recordings of previous events on YouTube.
Applications close: April 9th, 2020.
Applications are invited for 14 PhD positions (“Early Stage Researchers”) to be funded by the Marie-Skłodowska-Curie Innovative Training Network ‘MATER – Innovative Training Network in Female Reproductive Care’ within the Horizon 2020 Programme of the European Commission.
MATER is a consortium of 10 high-profile universities and companies with outstanding expertise in female reproductive genomics and medicine, located in Estonia, Finland, Belgium, Sweden, Spain, Poland and Italy.
The aim of the MATER consortium is to train a new generation of 14 creative, entrepreneurial, innovative and ethically sensitive early-stage researchers (ESRs) in the field of female reproductive care through a European Joint Doctoral Programme. The research aims of the project are to contribute to solving some of the most pressing challenges in female reproductive care and their ethical dilemmas by targeting delicate issues such as infertility and pregnancy complications, devising novel ideas to treat them, avoiding miscarriages, and implementing genetic technologies in prenatal diagnostics. The innovation aims of the project are to help students recognise and understand the ethical issues of their work, as well as to prepare them for converting novel knowledge and ideas into social and economic benefits. Each ESR will carry out their doctoral training jointly at two degree-awarding universities, while receiving consortium-wide practical and theoretical research-related and innovation training provided by all partners.
Further details here.