Tuesday 25 October 2022

Ethics committees shouldn't provide methodological reviews (IMHO)

I hesitate to write this blog in case I unleash a further torrent of strong opinions either way on Twitter, but I couldn’t resist. I won’t link to anyone’s tweets, as I don’t want to draw people into a conversation they don’t want to be further drawn into. Many of you will have seen a recent debate on Twitter about whether ethics committees (or IRBs) should include methodological reviews. In my opinion, they shouldn’t. Yes to improving methods and experiments, no to doing this through the pre-existing ethical review system.

Good methods aren’t an ethical issue (or aren't an ethical issue that is relevant to an ethics committee)

Why do ethics committees exist for research that is conducted on human participants (which I will focus on, as I am in a psychology department and the debate has largely centred on human research and psychology)? The answer is because psychology has a history of conducting experiments that have done actual harm to participants. The classic examples are the Stanford Prison Experiment and Milgram’s experiments on obedience. We have a clear moral and legal obligation to ensure the safety of our human participants. We need to ensure that they suffer no harm during the experiment, that they are able to consent in an informed manner, and that we have clear plans in place in relation to holding their data. We also have an obligation to ensure our research doesn’t inadvertently affect non-participants, for example when conducting research into specific groups that could lead to discrimination. Getting these things wrong could cause genuine harm to our participants and wider community and having a formal committee that reviews this is critical.

The argument put forward is that running experiments with “bad methods” is unethical and should therefore be considered by the ethics committee. The question then becomes: what is unethical about running a “bad” experiment? One possibility is that it is a waste of the participant’s time. I don’t think this is an ethical issue. If there is a clear statement in the information provided to the participant that they will not benefit in any way from participating (apart from remuneration for their time) then this would seem to cover this possibility. If the participant provides informed consent knowing this to be the case, this doesn’t seem like an issue to me.

Even if it were, we would then have to ask what “wasting someone’s time” means. I’m sure I could find a few psychologists who think an experiment I design is theoretically important, but if I sampled 100 people on the high street and asked them whether I was wasting someone’s time doing this experiment, they may well have a very different answer. Equally, I might design a very good experiment methodologically, but the question itself might be completely pointless (e.g., does the presence of a teddy bear increase the likelihood of someone choosing a Twix compared to a Mars bar?). There are no societal norms that provide a clear benchmark here.

The last point is that there are clear ethical and legal guidelines in place that allow ethics committees to set a clear bar for the acceptance or rejection of applications. Although it is plausible that an equivalent structure could be developed for methods reviews, none currently exists. The likely scenario is then that the bar must be set so low that it becomes essentially meaningless.

Ethics committees would struggle to assess methods

Let’s say good methods are an ethical issue that warrants consideration by the ethics committee. How then would methods get reviewed? Presumably the committee would consist of a wide range of researchers and the individual with the most expertise in a given area would be assigned to assess the methods of that application. I think this could work in a department that isn’t too methodologically diverse. For example, in my department most researchers are in some sense “cognitive psychologists”, despite the fact that some of us study memory, some language, some social interactions, and some development. There is therefore a common underlying theoretical framework and range of methods that we all might be able to assess. Indeed, we do include, for example, power analyses in our ethics applications and it isn’t too onerous (although I would argue it isn’t necessary).

In more methodologically diverse departments this won’t be the case. If you are the only quantitative researcher in a department of qualitative researchers (or vice versa) then there is not enough expertise to provide an informed review. This is an issue in some departments (regardless of whether methods are reviewed by ethics committees) – for example in relation to a lack of peer support and feedback from colleagues. The problem would be exacerbated if a formal (inadequate) review process were introduced, and would likely alienate colleagues further.

My other worry is that by claiming methods are an ethical issue, it has the potential to draw attention away from the real ethical issues that led to the formation of ethics committees in the first place. In an ideal world where everyone has lots of time, this might not be an issue, but if one committee member happens to pay a bit more attention to the methods review and less to the information provided to the participant prior to consent, this could cause problems.

Good intention, bad policy

At this point you might be thinking “but shouldn’t we be providing peer review and support to colleagues in relation to their experiments?”. The answer to this is a very big yes. However, (1) I don’t think this should be subsumed within an ethics application and (2) I don’t think it should be formalised to the extent that your experiment can be rejected. A good department should have multiple support structures in place to provide feedback on new experiments. This can be within-lab with lab meetings, across labs with departmental research groups/interest groups, within PhD tuition with thesis advisory panels, and ad hoc with peer-to-peer conversations. Many of us also get feedback on grant applications that includes detailed feedback on our proposed experiments. The best way to encourage a given researcher to improve their experiment is to ensure your research environment has multiple mechanisms in place to provide supportive, collaborative feedback. 

Perhaps this could be achieved through a formal ethics process (and it appears some institutions manage this relatively successfully), however it seems more difficult to achieve than through face-to-face, collaborative meetings that allow a rapid back and forth of feedback and response (instead of a binary pass/fail of an ethics committee) and where no researcher is given the power to reject your proposal if they don’t find it up to standard. Granting power to a specific individual needs to be carefully considered, with further structure in place to ensure that power isn’t abused (e.g., a senior colleague blocking a more junior colleague from conducting research because they don’t agree with their methodological approach). 

The general point here, which applies to several other issues in academia, is that the best way to improve research in your institution is to focus your efforts on creating a positive, diverse, collaborative research environment where people want to do their best research (and have the time and resources to achieve this). We can’t use small procedural tweaks to fix larger institutional problems.  


Acknowledgements: Thanks to three reviewers (who will remain anonymous) for providing feedback on an early draft of this blog. You know who you are.

Wednesday 19 October 2022

How to get PhD funding in the UK

It is that time of year again. The leaves are turning golden, red, and orange (or just brown), the nights are drawing in, and there is a chill in the air. Also, potential PhD students are emailing faculty members about applying for PhD positions.

The application and funding system in the UK is varied and complex. After going through the centralised UCAS system when applying for undergraduate courses, many students are left bewildered at exactly how to apply for a PhD and how to secure funding. Here is a brief guide for those applying in the UK.

Funding landscape

Broadly, there are three (well, four if you include self-funding, but I would try to avoid that if you can) ways to get funded in the UK: (1) an advertised funded PhD supported by a grant to the supervisor, (2) a centralised departmental/university studentship, and (3) a doctoral training programme (DTP) funded by one of the major UK funding bodies. Sometimes overseas tuition fees are covered, but often they are not. The lack of studentships available to international students, coupled with EU students now paying international fees, is depressing but a topic for another day.

The way you apply for each of these options will vary, and even if you are applying solely to DTPs, the application process will vary across DTPs. Faced with all this variety it is easy to get overwhelmed, trying to read different websites across different universities, some seemingly contradicting each other. What is a potential PhD student to do?

The first approach

The best way to avoid all this confusion is to take a step back from the funding mechanisms and think about what you want to study, where you want to study, and who you want to study with. You don’t need a clear proposal, but you should have a good idea of the general area, which departments are strong in this area, and who specifically does research in this area. Try to get to the point where you can identify an area and several researchers who could potentially supervise you. Once you have this information, you are ready to approach individual people.

I would start with emailing a select few people you really want to work with, preferably in October (now!) so there is plenty of time to apply. 2-4 potential supervisors should be sufficient at first, and if you don’t hear back from some you can always email others. You should try to research the people you approach – know exactly what research they do and preferably read a couple of recent papers. Make sure your research interests overlap with theirs. They don’t have to overlap exactly, but they should overlap enough, and you should be able to clearly state where this overlap lies.

Your email doesn’t want to be a long essay, but nor should it be a couple of sentences. I would start with a few sentences about your academic record (what you have studied, grades etc.), then a short statement about what your research interests are and what (general) topic you would like to pursue during your PhD. You should then make it clear how they fit with this – show that you know what they research and how your potential topic fits with that research. If you are relatively open about the topic, I would still try to say what you would like to do (to show you have some ideas) but state that you are also open to other projects. Finally, one or two sentences making it clear why you have emailed them (basically why you want to do a PhD at that department and with that supervisor) would be good. Attach your CV to this email.

The first meeting and beyond

Hopefully they will respond to you. It might be a simple “sorry, I’m not looking for new PhD students this year”, but it might be more positive. If it is, I would try to organise a Zoom meeting (or Teams meeting if you are a masochist) so you can meet them and vice versa. At this point you should be thinking about how well you connect with them – were you able to have an interesting conversation about potential projects, do they seem supportive, are they providing you with appropriate information and advice? The more you get a sense of whether you will be able to work with them for the next three years the better.

If all goes well, and you are both excited to apply for funding, this is the point when you need to think about the logistics of applying and potential funding mechanisms. Your potential supervisor should have a good idea of the funding landscape at their institution. There may be a single application, or you might have to apply for different funding programmes. If the latter, you should hopefully be able to write a single project proposal and then make small edits dependent on the specific application. Some DTPs don’t require a clear project proposal, and you don’t apply with the potential supervisor. This doesn’t remove the value of approaching your potential supervisor first, though. They should be able to offer support and advice on how to apply, increasing your chances of getting funded. In my opinion it is better to put more effort into fewer applications than to apply for as many things as possible. This is particularly the case if you are writing different project proposals with different potential supervisors. If you have a strong CV, then it is the proposal and the input and feedback you get from your potential supervisor that will likely make the difference between getting funded and missing out.

General advice

Deciding what to study and who to pick as a supervisor is difficult. At some point it is a risk and you have to take a chance. However, two things are critical: (1) you have to like and be interested in the research topic and (2) you have to connect with your potential supervisor. In relation to the latter, this doesn’t mean you can joke around with them (if you can that is fine). It means you can talk openly with them about research and about your career. It means that you feel they would support you during your PhD and help you do the best science you can do. This might mean they challenge you and ask difficult questions at times. However, they should create an atmosphere where you feel you want to rise to those challenges and also feel comfortable simply saying "I don't know (yet...)". Having seen different models of supervision, there is no one-size-fits-all approach. However, I have learned to value a professional but supportive atmosphere over a purely “friendly” one. The latter can sometimes, though not always, cover over more toxic behaviour that isn’t immediately obvious. Think carefully about whether you think you as an individual will be provided with the support you need to become the best researcher you have the potential to be by the time you finish your PhD.

Wednesday 5 October 2022

Working part time

This is, by definition, a self-indulgent blogpost as it is about me. As of 1st October 2022, I will be working part time, changing from a five-day to four-day working week. Many people do this already, including many academics, so why bother writing about it? First, I want to discuss reasons for and against doing it, and why I eventually chose to switch. Second, I want to offer some potential advice based on lessons I have learnt in the process of switching.

Why go part time?

The obvious answer is to work less. Many people need to go part time because of non-work responsibilities – looking after children or supporting relatives. Some have second jobs or side-projects that they want or need to dedicate more time to. Some simply want time off. For me there was no single reason for the change. I have parenting commitments, so often need to leave work early to pick the children up from school, but both my children are at school now until 3.15pm (apart from in school holidays, which is a whole different problem and discussion about balancing parenting and work commitments during these times). I have some academically inclined ideas that are not part of my job but I would like to pursue. I would also like a bit of time away from work that doesn’t involve having to constantly think about whether the children are hungry, or bored, or over-excited, or late for some extra-curricular activity, even if it is one morning each week. The primary reason then is simply to have a bit of time that is not fully booked – whether that be work or parenting.

The pandemic led a lot of us to reconsider our work-life balance. Over a year ago I thought about going part time but decided that the best thing to do then was to put more effort into managing my time – saying no to things, realising when a specific job did not need 100% of my attention and effort etc. This helped make the decision this time around easier. I had gone through the process of cutting back as much as I could, and it had been largely successful, but I still wanted to change to four days a week. It made sense for me, at this point in my life and career, to try something different.

The positives are clear to me: less work, less stress, more time to think and reflect on both work and life. What about the negatives? The obvious one is money. This is a particular concern given the cost of living crisis that is only going to get worse in the short-to-medium term. I do not have an obvious answer to this. Finances will be stretched more than they were before, but my family is in a relatively fortunate position financially so we should be able to afford it. I appreciate I am lucky to be in this position, and many simply cannot afford to work part time. The second possible negative is perhaps the main reason many academics do not work part time: they are worried about still working a full time job but receiving less pay. This is partly why spending a year managing my time better persuaded me this was the correct decision. Ultimately though, there is no reason why I should be guarding my time better on a part time vs a full time contract – we should all be guarding our time better regardless of our contracted hours. The third negative I have only just encountered is guilt. As the new academic year begins, I have less teaching than most of my colleagues and I cannot help but feel some guilt about this. I feel less “in the trenches” than before. However, the stretched resources of most academic departments aren’t really my fault and I’m being paid less money, so I’ll learn to get over this. There are plenty of other positive and negative aspects to working part time, but each will differ depending on the individual. My only conclusion here is that working part time is worth considering. Even if you decide not to, the process of thinking through the positives and negatives means that working full time becomes more of an active choice than a default option.

Lessons learnt

Given I have just started working part time, I have no words of advice on how to ensure you keep your non-contracted hours work-free. I am sure this will be a challenge for me. It has taken a while to get to this point though, and it is worth considering what I did and whether that was sensible. As I have already said, I had been thinking about this for well over a year and took active steps to manage my time better before eventually committing. This is definitely worth doing: if you can’t manage your time effectively, then decreasing your contracted hours isn’t necessarily going to help. Think about what jobs you need to do, what jobs you want to do, and what jobs are not necessary and do not bring you joy. Every time you are offered a new “opportunity”, do not say yes straight away. I am now in the habit of immediately replying with something like “this sounds interesting but I will need time to consider it. I will try to get back to you next week”. That gives me the breathing room to consider the costs and benefits and whether I really want to do it. If I am in any doubt, my default option is now to say no.

Once you have decided you want to work part time, the next step is to make sure you know what jobs will be taken away from you. The good thing about academia is some of our responsibilities are very clear and concrete – a specific module, a set number of project students, a citizenship duty. Ideally, your department will have a clear workload model that shows which aspects of your job equate to the amount of time you are cutting. If not, things can be trickier, but it should still be reasonably clear what might equate to (e.g.) 20% of your workload. There might be others you know in the department or wider university who can offer advice on this. My advice is to think through 2-3 possible ways to cut your workload and decide which you would prefer.

When you feel you have a clear idea, approach your Head of Department and talk to them. You will likely need to explain (1) why you want to do this and (2) what you would like to cut from your current workload. Hopefully your Head of Department will be supportive of any choice you have made concerning reducing your contracted hours; however, they do have a responsibility to balance workloads across the department with the limited resources they have available. This will almost inevitably lead to some degree of negotiation. The more prep you can do in advance of this, and the more options you have to suggest, the better. In my case, it took 2-3 meetings over a few weeks to arrive at a solution that we were both happy with. Do not immediately accept the first proposal that is presented to you, particularly if you are in a face-to-face meeting. Take your time and think clearly about whether it is a fair offer, preferably after the meeting. Think about possible changes to any proposal that would make it fairer and set these out in an email with your rationale, then set up another meeting for further discussion.

My final piece of advice is to be brave (that is a bit hyperbolic, it is only working part time after all). It took me a while to get to the point that I was willing to commit. I was not 100% sure it was the correct decision, and I am still not sure it is the correct decision. However, I did get to the point where I felt it was correct to try. I am sure there will be a time when I go back to full time work (sooner or later). I hope I can at least look back on this period of my life and appreciate the extra time I had. I also hope I use the time effectively, whatever that means.  

Wednesday 18 March 2020

Memory and The Brain Module


A few weeks ago a long discussion followed a tweet about the start of my “Memory & The Brain” module. I promised to provide a summary of the content of the module, as several people seemed to be interested. So here it is.

First, the context. This is an “advanced module” that final year undergraduates (and some Masters students) take after completing core 1st and 2nd year modules in “perception and cognition” and “brain and behaviour”. They therefore have a relatively strong background in undergraduate psychology, and the module is designed to push them beyond this core knowledge. It consists of eight 2-hour seminars. In several seminars (2, 3, 5, 6, & 7), I provide a 1-hour lecture and then the students present two key papers on the topic. The idea is for them to digest and then present the material to the other students, and to promote a discussion on the broader theoretical topic. This can often lead to stimulating discussion among the students where they learn how to critically evaluate the studies (and sometimes it can lead to awkward silences).

In each seminar I typically choose 2-3 “key papers” on the broader topic. Some of these papers I have chosen because they are clearly the key papers in the area. Some I have chosen because they are important, but are also well written, with a clear experimental design, to ensure the students are not overly stretched. I often include my own papers, not necessarily because I think they are THE key papers, but because (1) students seem to like reading and discussing their lecturers’ work and (2) it allows them to understand that scientists are able to critically evaluate their own work to the same degree as others. All this means that there will undoubtedly be key papers related to the topic that aren’t covered. These are the choices we sometimes must make when teaching.

Finally, the module has now run for 4 years with relatively little change to the content (some tweaks to which papers the students present), so this is perhaps a good moment to reflect on whether anything needs changing or updating. If you have any suggestions along these lines, please feel free to email me. With all that out of the way, here is the content. Note: the content is largely taken from the online material the students have access to; in italics is my narrative/summary of each seminar and how the seminars link together.

Module overview


Our memories make us who we are. They allow us to delve into our past and project ourselves into the future. How does the brain support something so complex, subjective and personal?

This module will explore the cognitive neuroscience of long-term memory, with a specific focus on episodic and spatial memory. We will explore this topic using a wide variety of methodologies – from traditional experimental psychology, to neuropsychology, to brain imaging, to electrophysiological recordings.

Learning outcomes


On completing this module, the student will be able to:
  • Appreciate the complexities involved in the study of long-term memory
  • Discuss memory research at multiple scales, from individual neurons, to cortical networks, to behaviour
  • Critically appraise research related to episodic and spatial memory
  • Identify different regions of the medial temporal lobe (MTL), including subfields of the hippocampus
  • Describe the main theoretical accounts of the medial temporal lobe and hippocampus

Seminar 1 – Memory systems and declarative memory


The purpose of this lecture is to teach and/or revise what is commonly taught at the undergraduate level – primarily Squire’s taxonomy of memory, and the possible “types” of memory that have been identified by neuropsychology and neuroimaging studies. Many of the students will have covered some aspects of this before (e.g., the distinction between episodic and semantic memory). Here I want to present the material in such a way as to prepare them for the remainder of the module. In particular, I want them to think carefully about what a “type” of memory might be, and whether this is a good way of conceptualising long-term memory.

Learning outcomes


After the lecture, the student will be able to:
  • Discuss evidence for multiple memory systems
  • Understand the different sources of evidence provided by neuropsychology and functional brain imaging
  • Explain the role of the medial temporal lobes in long-term declarative memory

Key Reading


  1. Squire, L.R., & Zola-Morgan, S. (1991) The Medial Temporal Lobe Memory System, Science, 253(5026), 1380-1386.
  2. Scoville, W.B., & Milner, B. (1957) Loss of recent memory after bilateral hippocampal lesions, Journal of Neurology, Neurosurgery & Psychiatry, 20(1), 11-21.

Further Reading


  1. Chapter 7: Long-term memory systems, in Eysenck & Keane, Cognitive Psychology: A Student's handbook

Seminar 2 – Episodic and semantic memory


Here we discuss the distinction between episodic and semantic memory. We start off with evidence that seemingly provides a double dissociation between episodic and semantic memory – MTL amnesic patients vs semantic dementia patients. We then discuss how this difference may stem from the fact that episodic memory tests typically assess anterograde memory (learning of new material) whereas semantic memory tests typically assess retrograde memory (retrieval of material learnt prior to brain injury/degeneration). Because of this, we focus on research that assesses whether amnesic patients with damage to the hippocampus/MTL can learn new semantic information.

Learning outcomes


After the lecture, the student will be able to:
  • Give clear real-world examples of episodic and semantic memory
  • Discuss evidence for possible dissociations between episodic and semantic memory
  • Report the key brain regions involved in episodic and semantic memory

Key Reading


  1. Vargha-Khadem et al., (1997) Differential effects of early hippocampal pathology on episodic and semantic memory, Science, 277, 376-380.
  2. Tulving et al., (1991) Long-lasting perceptual priming and semantic learning in amnesia: a case experiment, Journal of Experimental Psychology: Learning, Memory & Cognition, 17(4), 595-617.
  3. Hodges et al., (1992) Semantic dementia. Progressive fluent aphasia with temporal lobe atrophy, Brain, 115(6), 1783-1806.
  4. Hamann & Squire, (1995) On the acquisition of new declarative knowledge in amnesia, Behavioral Neuroscience, 109(6), 1027-1044.

Further Reading


  1. Squire & Zola (1998) Episodic memory, semantic memory and amnesia, Hippocampus, 8(3), 205-211.

Seminar 3 – Recollection and familiarity


We next focus on episodic memory, and the distinction between familiarity and recollection. We cover the strengths and weaknesses of three different approaches to dissociating between these two plausibly distinct processes – the remember/know procedure, the process dissociation procedure, and signal detection theory/ROC curves. We then discuss neuropsychological data for/against this distinction.

Learning outcomes


After the lecture, the student will be able to:
  • Explain the distinction between recollection and familiarity
  • Appreciate how signal detection theory has contributed to the recollection/familiarity distinction
  • Report the key brain regions involved in recollection and familiarity

Key Reading


  1. Yonelinas (1994) Receiver-operating characteristics in recognition memory: evidence for a dual-process model, Journal of Experimental Psychology: Learning, Memory & Cognition, 20(6), 1341-1354.
  2. Bowles et al. (2010) Double dissociation of selective recollection and familiarity impairments following two different surgical treatments for temporal-lobe epilepsy, Neuropsychologia, 48(9), 2640-2647.
  3. Wais et al. (2006) The hippocampus supports both the recollection and the familiarity components of recognition memory, Neuron, 49(3), 459-466.
  4. Horner et al., (2012) A rapid, hippocampus-dependent, item-memory signal that initiates context memory in humans, Current Biology, 22(24), 2369-2374.

Further Reading:


  1. Aggleton & Brown, (1999) Episodic memory, amnesia, and the hippocampal-anterior thalamic axis, Behavioral and Brain Sciences, 22(3), 425-444.
  2. Brandt et al., (2009) Impairment of recollection but not familiarity in a case of developmental amnesia, Neurocase, 15(1), 60-65.

Seminar 4 – Medial temporal lobe architecture


Prior to seminar 4, we have discussed the hippocampus, perirhinal cortex, and the medial temporal lobes, but students have learnt little about the underlying architecture of these regions. We therefore cover the major inputs into the perirhinal and parahippocampal cortices, the entorhinal cortex, and the trisynaptic loop. This is covered at this point, given that the following seminar requires knowledge of the individual subfields of the hippocampus (in particular CA3 and DG). The emphasis of this seminar is to understand how knowledge of the underlying architecture provides clues as to what the functions of each region might be (e.g., if the perirhinal cortex receives major input from the ventral visual stream, it is likely to process object/item information relative to the parahippocampal cortex).

The format of this seminar is slightly different from other seminars, in that I give a 1-hour lecture and then students go through a workbook of brain diagrams/images in groups, identifying key regions (this replaces student presentations).

Learning outcomes


After the lecture, the student will be able to:
  • Discuss the principal inputs into the medial temporal lobes
  • Identify the subfields of the hippocampus
  • Explain the circuitry of the hippocampal trisynaptic loop

Key Reading


  1. Preston & Wagner, (2007) The medial temporal lobe and memory, in Kesner & Martinez (Eds) The Neurobiology of Learning and Memory, 305-337.
  2. Amaral (1999) Introduction: what is where in the medial temporal lobe? Hippocampus, 9(1), 1-6.
  3. Lavenex & Amaral (2000) Hippocampal-neocortical interaction: a hierachy of associativity, Hippocampus, 10(4), 420-430.

Further Reading


  1. Amaral & Lavenex (2007) Hippocampal neuroanatomy, in Andersen et al. (Eds) The Hippocampus Book, 37-109.

Seminar 5 – Pattern separation and pattern completion


We cover the two computational processes of pattern separation and pattern completion, and how these are likely supported by DG and CA3 respectively. We then cover the related concept of attractor dynamics, and how this might relate to pattern separation/completion. The key readings are human fMRI studies, though we start to cover more rodent electrophysiology work from this point. We focus on how pattern separation/completion might be useful computations in relation to episodic memory.

Learning outcomes


After the lecture, the student will be able to:
  • Explain how pattern separation and pattern completion might support memory
  • Report the hippocampal subfields that support pattern separation and pattern completion
  • Discuss research in rodents and humans that provides evidence for these computations

Key Reading


  1. Horner et al. (2015) Evidence for holistic episodic recollection via hippocampal pattern completion, Nature Communications, 6(7462), 1-11.
  2. Berron et al. (2016) Strong evidence for pattern separation in the human dentate gyrus, Journal of Neuroscience, 36(29), 7569-7579.

Further Reading


  1. Wills et al. (2005) Attractor dynamics in the hippocampal representation of the local environment, Science, 308(5723), 873-876.
  2. Neunuebel & Knierim (2014) CA3 retrieves coherent representations from degraded input: direct evidence for CA3 pattern completion and dentate gyrus pattern separation, Neuron, 81(2), 416-427.
  3. Bakker et al. (2008) Pattern separation in the human hippocampal CA3 and dentate gyrus, Science, 319(5870), 1640-1642.
  4. Nakazawa et al. (2002) Requirement for hippocampal CA3 NMDA receptors in associative memory recall, Science, 297(5579), 211-218.
  5. Steemers et al. (2016) Hippocampal attractor dynamics predict memory-based decision making, Current Biology, 1-8.

Seminar 6 – Functional neurons in the medial temporal lobe


Prior to seminar 6, we have primarily covered fMRI and neuroimaging in humans, and as such know little about what individual neurons in the MTL do. Here we cover the major “functional neurons” in the MTL, as revealed by single-unit electrophysiology in rodents and humans – place cells, head-direction cells, grid cells, boundary/border cells, and “concept” cells. Towards the end of the seminar we discuss how these cells respond to (e.g.) the rodent’s current position or heading direction, and so seem not to serve an obvious “memory” function. At this point, the aim is simply to get students thinking about this possible disconnect – patients with MTL damage clearly show memory deficits, yet individual neurons in the MTL respond to stimuli in the present (i.e., appear somewhat more “perceptual” in nature).

Learning outcomes


After the lecture, the student will be able to:
  • Report the main functional neurons in the hippocampus
  • Describe the firing characteristics of these neurons
  • Appreciate how these neurons contribute to spatial and episodic memory

Key Reading


  1. O'Keefe & Dostrovsky (1971) The hippocampus as a spatial map. Preliminary evidence from unit activity in the freely-moving rat, Brain Research, 34(1), 171-175.
  2. Hafting et al. (2005) Microstructure of a spatial map in the entorhinal cortex, Nature, 436(7052), 801-806.
  3. Quiroga et al. (2005) Invariant visual representation by single neurons in the human brain, Nature, 435(7045), 1102-1107.

Further Reading


  1. Solstad et al. (2008) Representation of geometric borders in the entorhinal cortex, Science, 322(5909), 1865-1868.
  2. Lever et al. (2009) Boundary vector cells in the subiculum of the hippocampal formation, Journal of Neuroscience, 29(31), 9771-9777.
  3. Taube et al. (1990) Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis, Journal of Neuroscience, 10(2), 420-435.

Seminar 7 – Process vs representational accounts of the medial temporal lobes


Building on the content from seminar 6, we cover two dominant theories in the literature in relation to how to best characterise the medial temporal lobes – namely process vs representational accounts. We discuss the key research in humans that provided some of the first clear evidence in favour of representational accounts. We finish by trying to reconcile the representational account with the “memory” deficits that patients with MTL damage present with. In particular, we discuss how certain processes (e.g., episodic memory) may rely more heavily on specific representations (e.g., complex configural representations supported by the hippocampus) than other representations.

Learning outcomes


After the lecture, the student will be able to:
  • Appreciate the distinction between process and representational accounts
  • Critically appraise evidence for and against these differing accounts

Key Reading


  1. Ranganath et al. (2001) Medial temporal lobe activity associated with active maintenance of novel information, Neuron, 31(5), 865-873.
  2. Hartley et al. (2007) The hippocampus is required for short-term topographical memory in humans, Hippocampus, 17, 34-48.
  3. Lee et al. (2005) Specialization in the medial temporal lobe for processing of objects and scenes, Hippocampus, 15(6), 782-797.
  4. Barense et al. (2007) The medial temporal lobe processes online representations of complex objects, Neuropsychologia, 45(13), 2963-2974.

Further Reading

  1. Lee et al. (2008) Activating the medial temporal lobe during oddity judgement for faces and scenes, Cerebral Cortex, 18(3), 683-696.
  2. Wang et al. (2010) The medial temporal lobe supports conceptual implicit memory, Neuron, 68(5), 835-842.
  3. Schnyer et al. (2006) Rapid response learning in amnesia: delineating associative learning components in repetition priming, Neuropsychologia, 44(1), 140-149.
  4. Nadel & Hardt (2011) Update on memory systems and processes, Neuropsychopharmacology Reviews, 36(1), 251-273.

Seminar 8 – The medial temporal lobe beyond episodic memory


In the last seminar, I present some of the studies I think are most interesting in relation to the medial temporal lobes – suggesting they play a role in episodic future thinking, scene construction, imagined navigation, decision-making, and moral judgements. For the second half of the seminar, the students split into groups and go over the 8 seminars, discussing the content they found most challenging. We then have a group revision session where I help to clarify any material they may not understand, or we discuss topics that they find particularly interesting. After this, they leave with smiles on their faces and lead productive, happy, successful lives (this has nothing to do with the fact that they completed my module though).


Learning outcomes


After the lecture, the student will be able to:
  • Appreciate that the medial temporal lobes aren't solely 'memory' structures
  • Evaluate research showing medial temporal lobe involvement in non-memory tasks
  • Discuss what role the medial temporal lobes play in our mental lives

Key Reading


  1. Addis et al. (2007) Remembering the past and imagining the future: Common and distinct neural substrates during event construction and elaboration, Neuropsychologia, 45(7), 1363-1377.
  2. Hassabis et al. (2007) Patients with hippocampal amnesia cannot imagine new experiences, Proceedings of the National Academy of Science, 104(5), 1726-1731.
  3. Wimmer et al. (2012) Preference by association: How memory mechanisms in the hippocampus bias decisions, Science, 338(6104), 270-273.

Further Reading


  1. McCormick et al. (2016) Hippocampal damage increases deontological responses during moral decision making, Journal of Neuroscience, 36(48), 12157-12167.
  2. Zeithamova et al. (2012) Hippocampal and ventral medial prefrontal activation during retrieval-mediated learning supports novel inference, Neuron, 75(1), 168-179.
  3. Horner et al. (2016) Grid-like processing of imagined navigation, Current Biology, 26, 842-847.

Areas not currently covered (but could be)


The module is necessarily selective, and there are some topics I would like to cover but don’t feel I have the time. These include:

  • Systems consolidation – we mention this in passing, but don’t cover it systematically. However, there is another advanced module that focusses on sleep and memory, so if I were to include it there might be too much overlap between modules.
  • Brain networks – the module is heavily focussed on the medial temporal lobes. That partly reflects my research interests, but it also reflects my desire for the students to focus more on the broader theoretical questions (e.g., process vs representational accounts) as opposed to the neuroscience.
  • Memory and emotion – this is a big topic, but one students would definitely find interesting. If students are taking this module as part of a Masters level degree, they do have the option of reading some of this literature.
  • Hippocampal longitudinal axis – the module focusses on hippocampal subfields at the expense of the anterior-posterior axis of the hippocampus. As with memory and emotion above, if students are taking this module as part of a Masters level degree, they do have the option of reading some of this literature.
  • Forgetting – this reflects my own shifting research interests but forgetting is a fascinating topic with a rich psychological and neuroscientific history.


I’m sure there are other topics as well, but these are the ones I have thought about including previously. As above, if you have any thoughts or suggestions, feel free to email me. I think that just about covers it. I hope this is useful to some – possibly just as a way of figuring out what NOT to teach. Happy teaching to you all.

Monday 26 March 2018

Excuses for not taking extended paternity leave


Men usually get 2 weeks of paternity leave in the UK. Most men I know have taken this leave when their child was born, and gone straight back to full time work following this. Perhaps they took holiday days following this when needed. This compares to women who are allowed up to 1 year of maternity leave (pay depends on your employer, but typically decreases over this time and the final 3 months are usually unpaid). Most women I know have taken somewhere between 6 months to 1 year, with 9 months being quite common. It goes without saying that this results in a big difference in parental roles in the first year of a child’s life in the UK, with a host of demonstrable knock-on effects in terms of career progression and pay that can last a lifetime.

There is provision for men to take off more time, through shared parental leave. This has been available since 2015, and allows the couple to share up to 50 weeks of leave in a relatively flexible manner. I am currently 1 month into a 3-month stint of parental leave (from 9 months to 1 year) and my wife went back to work at 9 months. However, I know few men who have committed to more paternity leave than the usual 2 weeks despite this legal entitlement.

Below is a list of a few “excuses” that I have heard in relation to this. A few caveats first though. First, I am assuming you are committed to equality between men and women. If you’re not, take a long walk off a short pier. Second, I am writing as someone in a heterosexual relationship, to other men in a similar relationship. This is because it is what I have experience with, and it’s probably where most change needs to occur in terms of questioning gender roles. Third, I am presuming you are eligible for shared parental leave in the UK. If you’re not, this is a pretty good excuse for not taking shared parental leave. Fourth, this is not meant to be judgmental, or if it is I apply it equally to myself as well as others. I think I have probably used most of these excuses either implicitly or explicitly when making decisions about the length of my paternity leave for both my daughters (~2 months for the first on a relatively ad hoc basis, 3 months for the second). Finally, some of these excuses are legitimate in certain situations, which is perhaps why they are easy to use as a post hoc way of justifying your decision. The point of this list is to identify these common excuses so we can question them more thoroughly next time we’re making a decision.

1. I didn’t know I could
This one is easy to deal with. It isn’t an excuse. If you wanted to take extended paternity leave, you would have googled it. Ignorance isn’t an excuse.

2. It isn’t part of my work culture
You happen to be in a job where men don’t typically take long periods of paternity leave? What are the chances of that happening? Well, fairly high given that’s the current norm in UK society. I would ask yourself two questions though: (1) do I think this culture should change, and (2) have I previously pushed for change at work in other ways (not related to paternity leave)? If you answered yes to both of these questions, or just yes to the first one, then perhaps you’re realising this isn’t a great excuse either. You are legally entitled to shared parental leave. The only way it becomes part of work culture is for men to start taking advantage of this opportunity.

3. I earn more than my wife
So what? If anything, this is a reason to take more time off, so maternity leave has less of an effect on your wife’s pay packet. You taking time off might help to redress the balance, or at least not exacerbate it.

4. We can’t afford it
This may be a completely legitimate excuse. However, it’s still important to question this. For example, you are presumably already taking a financial hit with your wife being on maternity leave. Why did you decide this was affordable but you taking time off wasn’t? Did you sit down and carefully do the sums, or did you decide relatively quickly that whatever financial cost was associated with your wife taking maternity leave was worth the sacrifice but somehow it wasn’t with you taking time off?

This obviously interacts with excuse 3. If you earn more, you will take more of a financial hit by taking time off. However, it might be worth asking why you are earning more. Did your wife not apply for that dream job during early pregnancy because she had better maternity cover at her current job? Did you move city for your job and not hers even before you were considering having children?

5. It would affect my career progression
Good. It’s about time men risked taking a hit, given the sacrifices women have been making in terms of career progression and pay. Ultimately though, I think it unlikely that (e.g.) 3 months away from work is really going to have a massive effect on your career. The mere fact that you are having a child is going to affect your career in some way, so taking a bit of time off isn’t going to make things much worse. A related excuse is "it's not a good time career-wise". It never is, get over it.

6. My wife wants to take the full year of maternity leave
This is a trickier one. I completely understand why someone might want to take a full year of maternity leave (leaving you with no “shared” time to take off). There are a few things to ask though. First, how much of this desire is based on societal expectations? I have seen pregnant women questioned when they say they’re planning on taking “only” 6 months off. Second, do you always acquiesce to your wife’s wishes, or do you typically talk things through and arrive at a mutual decision that takes into account both your wishes? I would hope that any equal partnership would be able to deal with this amicably, and your wife would be able to see the positives in you spending more time with your child.

7. I’m scared, and not very good at looking after my baby
It’s OK to be scared. I was terrified when I made the decision. You have to make it relatively early on though, and then by the time it comes around you’re already committed so can’t do anything about it. You will mess things up. You will find it hard. You will find it rewarding though, and you will enjoy yourself (at least some of the time).


***EDIT***
I'll keep adding to the list based on suggestions from Twitter.

8. My wife is breastfeeding our baby
Thanks to Catherine Manning and Jenni Rodd for pointing this out. This is potentially a good excuse for not taking paternity leave early in the first year, but doesn't apply as much to taking time off later in the year. This can certainly make things logistically difficult if your wife is still breastfeeding when she goes back to work. There are ways around this though. I have talked to female colleagues who have gone back to work relatively early. They have got by through pumping as well as the man bringing the child to work once a day to be breastfed. As I said, logistically difficult but not necessarily a deal breaker. If this just isn't feasible, consider taking the final few months off, where your child is likely to be eating mostly solids and can much more easily be fed occasionally with a bottle. 

Thursday 12 October 2017

Being an academic

This blog post was prompted by a long twitter conversation on science careers, number vs quality of publications, and the inevitable inclusion of the term ‘glam mags’. Although twitter is great for an immediate exchange of ideas, it isn’t good for nuance. Here are some of my thoughts on a career in academia.

First, if you want to be an academic in the long-term, you probably can be. It may not be the academic career you dreamt of, or at the university you wanted to be at. You may not be teaching exactly what you want. You may not have much time for research at all. However, if you decide to be an academic, you work hard, and you aren’t precious about your definition of ‘success’, then it’s a career like any other. There are plenty of people out there without glittering CVs who are grafting and getting by, but ultimately are fully fledged academics who do amazing jobs day-after-day for little praise or reward.

Much of the conversation on twitter relates to academic ‘success’ rather than simply what it takes to be an academic long-term. People obviously have high expectations, and want to be successful. I get that. I want to do amazing science with clever supportive colleagues, teach happy engaged students, and have a vibrant well-funded lab. Where I think there might be an issue is the expectation that this is what it means to be an ‘academic’. To me, this is what it means to be a high-flying successful academic. None of us have a right to expect such a career. I am grateful that I have a good solid job in a university I like. If I have success over-and-above this, it is a massive bonus. I will of course be under pressure from those above me to be that high-flying academic, and I will work towards that as best as I can, but I can’t expect it and it certainly isn’t a right.


So what is my advice to those more junior than myself? During your PhD, do the best possible science you can and try to ensure you publish at least one good solid paper. This is clear evidence that you can push a project from start to finish, and that you have developed a set of experimental/methodological skills. If you feel you have time, apply for fellowships that might propel your career into the stratosphere, but don’t bank on it. Concentrate on finding a postdoc in a lab where you feel you would fit in, you can further your skill set, and you can do good science. Join twitter and read conversations about careers, glam mags, etc. but don’t let it get under your skin. Understand the system you are trying to navigate, but appreciate that there is a huge amount of inherent noise such that there is no one-size-fits-all recipe for career success. Most importantly, enjoy yourself as much as possible. If you do commit to academia and don’t get that dream job, you will probably look back on your PhD and postdoc positions rather fondly. If you do stay in academia though, well done you. Regardless of what your CV looks like, you’re a success.

Friday 16 December 2016

My first year as a lecturer (actually 11 months)

A few months ago I blogged about my first 6 months as a lecturer. This is the follow-up, (almost) one year after starting. Last time I confessed to feelings of loneliness as I was tasked with writing grants and setting up a lab before the teaching and administrative load started to ramp up. Well, the latter has now happened. I have been given a relatively substantial administrative position (though certainly not the most arduous) and although I am yet to deliver lectures, this term I have dedicated a significant amount of time to activities related to teaching. This means I have had virtually zero time to do research, but all is not lost on this front (see good news below).

Administratively I am now “strand-leader” of the neurosciences strand of the Natural Sciences programme at York. This does take up a reasonable amount of time, but certain aspects of the job are actually quite fun. I have spent several afternoons interviewing incredibly bright A-level students (most are expected to get straight A/A*s), asking them about their scientific interests and trying to get them to think laterally about topics/questions they may not have encountered before. The sheer enthusiasm of many of these students is a tonic for the soul. On the other hand, I had to sit through a 3-hour board of studies meeting the other day.

Teaching this term has consisted of supervising “literature surveys” – talking to final year students in small groups (~7) about their topic of interest, helping them frame their specific question/topic, and providing feedback on their plans. I have also started to supervise 3rd year empirical projects, though the bulk of that occurs next term. Finally, as I am teaching two modules next term, I have spent a considerable amount of time reading around certain topics, and preparing lectures. This can be rewarding as it forces you to go back to basics in a particular area, but it is hugely costly in terms of time and is relatively open-ended. As a new lecturer, how do I know when I have read “enough” to teach the topic to 2nd or 3rd year BSc students? My conclusion was that if I care enough to worry about it, I’m conscientious enough to do a good job, but time will tell.

Finally, although I currently have very little time for research, I am fortunate in that I managed to secure a (smallish) grant to employ a postdoc over the next 2 years. That, alongside a PhD student who started in October, means data will be accruing whilst I am teaching next term. It could be a lot worse, and to be honest it couldn’t get much better.

The contrast between the first and second 6 months has been stark. The increased administrative and teaching load has been a shock to the system, and for a 3-4 week period I really felt like I was drowning. I am just as busy now, but I seem to be learning to deal with it a bit better. I am getting better at trusting that I can get things done relatively last minute if necessary (this goes against my innate nature, so has been difficult to learn). The upside to this increased stress has been a sense of increased belonging. The negative way of putting this would be “siege mentality”, but I feel there is a sense of camaraderie with colleagues that I haven’t felt before. Alongside this is an increased awareness of the “big picture” - understanding how the university and department function and how the nitty-gritty of day-to-day teaching/research/administration works. This helps in generating distance from the minor setbacks one receives. As with last time, a few possible words of advice:
  1. If you have time before teaching starts, get grants in ASAP. Smaller ones in particular that have a faster turnaround time and potentially have a higher chance of success. Concentrate on getting money for personnel – having individuals to collect data will reap rewards when you’re teaching; an expensive bit of kit that you have no time to use won't.
  2. As I suggested last time, talk to colleagues as much as possible. Get their advice when you’re struggling. Ask questions about the department and how it functions. Turn up to meetings and talks. Become part of the academic community.
  3. Realise that your first lecture isn’t going to be perfect. Cover a sensible amount of material as clearly as possible. Don’t worry too much about whether the students walk away thinking you’re amazing. Make sure they have learnt something in the 1-2 hours you have with them.
  4. Accept that you won’t have control of everything at all times. Allow some things to slip if you have to. Prioritise your time as effectively as possible. It may feel like you’re constantly putting out fires (that’s certainly how I feel), but as long as nothing develops into an inferno then I count that as a success.