NISS Writing Workshop for Junior Researchers 2025: Building Strong Foundations in Academic Writing and Grant Proposals 

Writing Workshop for Junior Researchers 2025 – Day 1: Building Strong Foundations in Academic Writing and Grant Proposals 

Friday, July 18, 2025 | National Institute of Statistical Sciences 

The Writing Workshop for Junior Researchers 2025 opened on Friday, July 18, with an online program dedicated to strengthening academic writing, publishing strategies, and grant proposal skills for early-career statisticians and data scientists. Organized by the National Institute of Statistical Sciences (NISS) and co-led by Piaomu Liu, the event brought together experienced scholars to share practical guidance and personal experiences with the next generation of researchers. 

Opening the Workshop 

The day began with opening remarks from David Matteson (Cornell University), Director of NISS. He welcomed participants, introduced NISS’s mission, and highlighted the organization’s new initiatives in data science, public health, and AI. Matteson emphasized the importance of clear communication in research and previewed the workshop’s focus on ethical writing practices, publishing strategies, and uncertainty quantification. 

Academic Writing as an Act of Kindness 

The first tutorial, led by Donald Richards (Penn State University), was titled “Writing Well as an Act of Kindness for Our Students and Colleagues.” Richards encouraged participants to view writing not only as a scholarly obligation but also as a way of supporting colleagues, students, and the broader academic community. Drawing on personal experience and examples from renowned statisticians and mathematicians, he stressed the importance of clarity, simplicity, and revision. Richards urged junior researchers to read widely, seek good mentors, and practice communicating their ideas in ways that benefit both themselves and others. 

Choosing Where to Publish 

In the second tutorial, “Choosing Where to Publish,” Nick Jewell (University of California, Berkeley) shared insights from his extensive publishing experience. Jewell encouraged participants to aim high in their choice of journals while preparing backup plans in case of rejection. He highlighted the importance of accessible writing, polished copy-editing, and up-to-date applications. Jewell also discussed the role of cover letters, the balance between collaboration and independence, and the evolving role of AI tools such as ChatGPT in manuscript preparation. 

Grant Writing Panel 

The afternoon featured an in-depth panel on grant writing moderated by Daniel Kowal (Cornell University). Panelists Brani Vidakovic (Texas A&M University), Yulia R. Gel (Virginia Tech), and Ander Wilson (Colorado State University) drew on their experiences with NSF and NIH funding to provide practical guidance. 

Vidakovic and Gel emphasized mentorship, iterative feedback, and careful attention to solicitation guidelines when preparing NSF proposals. They warned against incremental projects and advised tailoring proposals to specific programs while reaching out to program officers for clarification. Wilson shared his experiences with NIH awards, underscoring the need to highlight health implications and build interdisciplinary teams. 

The panel also covered broader strategies, including using templates to streamline proposal writing, beginning budgets early, and incorporating broader impacts. Panelists discussed handling rejections constructively, learning from serving as proposal reviewers, and exploring opportunities with foundations and industry partners. 

Reviewing and Revising 

The final tutorial of the day, “Reviewing and Revising,” was led by Naomi Altman (Penn State University). Altman shared her deep expertise in the peer review process, stressing the importance of thorough revisions, constructive responses to reviewer comments, and transparent collaboration with co-authors. She advised junior researchers to plan early for promotion and tenure requirements, maintain rigorous record-keeping practices, and write in ways that remain accessible to readers outside their specialties. 

Mentoring Connections 

Throughout the day, participants also engaged in mentor–mentee meetings over a virtual lunch and optional follow-up sessions. These smaller discussions gave junior researchers the opportunity to build relationships, receive personalized feedback, and extend conversations from the formal sessions. 

Day 1 set a strong foundation for the workshop, equipping attendees with tools for effective writing, publishing, and grant preparation. With insights from experienced scholars, participants left with both practical strategies and inspiration for advancing their research careers. The workshop continues with additional tutorials and panels designed to support the professional development of junior researchers in statistics and data science. 

Day One – Key Takeaways 

These recommendations, shared by workshop speakers and panelists, highlight strategies for successful grant applications, professional growth, and effective publishing in statistics and data science. 

Grant Writing & Funding 

  • Start Early: Begin grant preparation months in advance and seek feedback from colleagues. 
  • Know the Rules: Read the NSF Proposal & Award Policies & Procedures Guide carefully. 
  • Show Impact: Make broader impacts concrete by collaborating with university resources. 
  • Be Novel: Clearly demonstrate innovation and avoid incremental proposals. 
  • Engage Program Officers: 
    • NIH – consult before submission (specific aims) and after scores (feedback). 
    • NSF – reach out for new/interdisciplinary programs, but don’t send full proposals. 
  • Think Broadly: Apply to NSF directorates outside statistics when methods can advance other fields. 
  • Budget Smart: Begin budgets early and coordinate with administrators. 
  • Stay Informed: Sign up for funding newsletters. 
  • Get Involved: Register as a proposal reviewer to gain insights. 
  • Alternatives: Consider Simons Foundation grants if new to federal funding. 
  • Avoid Pitfalls: Don’t submit the same proposal to multiple NSF programs. 

Professional Development 

  • Meet early with your department’s Promotion & Tenure chair to learn requirements. 
  • Co-author a paper with a senior colleague to build credibility for tenure. 

Manuscript Writing & Publishing 

  • Write detailed cover letters addressing reviewer/editor comments. 
  • When resubmitting, address prior reviewer comments, even at a new journal. 
  • Highlight changes in revised manuscripts when feasible. 
  • Have all co-authors review and sign off on final drafts and galley proofs. 
  • Keep code and data well-documented and accessible for revisions. 
  • Follow up with journals if no decision in 3 months. 
  • Adhere strictly to journal formatting requirements. 
  • Ensure supplemental files are accurate and uploaded correctly. 
  • Include software versions and experimental design details when required (a brief illustration of recording versions follows this list). 
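
To make the software-versions recommendation above concrete, the short Python sketch below shows one way an analysis script could record the interpreter and package versions for a manuscript supplement. It is an illustrative example only, not part of the workshop materials, and the listed package names are assumptions.

    # Illustrative sketch only: record Python and package versions for a
    # manuscript's reproducibility or supplementary-materials section.
    # The package names below are assumptions for the example.
    import importlib.metadata
    import platform

    def software_versions(packages):
        """Return a dict mapping Python and each package to its installed version."""
        versions = {"python": platform.python_version()}
        for name in packages:
            try:
                versions[name] = importlib.metadata.version(name)
            except importlib.metadata.PackageNotFoundError:
                versions[name] = "not installed"
        return versions

    if __name__ == "__main__":
        # Example packages a typical statistical analysis might use.
        for pkg, version in software_versions(["numpy", "pandas", "scipy"]).items():
            print(f"{pkg}: {version}")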

Peer Review & Editorial Process 

  • Complete reviews within 2–4 weeks of accepting assignments. 
  • Corresponding authors: carefully coordinate galley proof corrections with editors. 

NISS Writing Workshop 2025 – Day Two Focuses on Collaboration, Publishing, and Ethics in Research 

Friday, July 25, 2025 | National Institute of Statistical Sciences 

Day two of the NISS Writing Workshop 2025 brought participants back together online for a packed program of tutorials, panels, and breakout sessions designed to strengthen research writing, foster mentorship, and deepen understanding of publishing practices in statistics and data science. 

Megan Glenn opened the session by previewing the day’s agenda, which featured tutorials on collaborative paper writing and publication ethics, a panel discussion with journal editors, mentor–mentee networking, and a concluding happy hour. She also reminded attendees that Day 3 of the workshop would take place in person at JSM, featuring a short course on ChatGPT and a career development panel. 

Writing Collaborative Papers 

The morning began with Dr. Susan Ellenberg (University of Pennsylvania) leading a tutorial on How to Write a Collaborative Paper. Drawing on her extensive experience in clinical trial design and oversight, Ellenberg emphasized the statistician’s role in ensuring clarity, precision, and reproducibility in research writing. She highlighted common challenges—such as exclusions from analyses, multiplicity of comparisons, and post-hoc interpretations—and urged statisticians to advocate for transparency and honesty, even when facing resistance. 

Ellenberg also discussed authorship practices, noting that statisticians who make significant contributions should often be listed as second author. She encouraged concise writing and full participation in drafting and reviewing all parts of a manuscript, not just the statistical sections. 

Insights from Journal Editors 

The workshop then transitioned into Panel 2: Statistics and Data Science Journals, moderated by Toryn Schafer (Texas A&M University). 

Panelists included: 

  • Rebecca Hubbard (Brown University), Co-Editor of Biostatistics and Statistical Editor for the New England Journal of Medicine 
  • Bill Rosenberger (George Mason University), Co-Editor of Biometrics 
  • Kate Calder (University of Texas at Austin), Area Editor for the Annals of Applied Statistics 

The panel offered guidance on journal selection, peer review, and the evolution of academic publishing. Hubbard emphasized that publishing should prioritize communicating new ideas rather than simply expanding publication lists, while Rosenberger described the realities of high rejection rates and the importance of submitting work to journals that are the right fit. He also introduced Biometrics’ new “Best Referee Award” to recognize excellence in peer review. Calder underscored the need for thoughtful literature reviews and submissions that integrate methodological innovation with real-world data applications. 

The discussion touched on reproducibility, code sharing, and the challenges of dealing with inadequate or non-expert reviews. Panelists encouraged persistence in the face of rejection and advised junior researchers to carefully match their work with journal scope. 

Ethics and Reproducibility 

After a virtual lunch break and mentor–mentee meetings, the workshop reconvened for Tutorial 5: Ethical Issues and Reproducibility, led by Peter Imrey (Cleveland Clinic Lerner College of Medicine). 

Imrey’s talk highlighted ethical challenges in research publishing, including plagiarism, ghost authorship, selective reporting, and conflicts of interest. He stressed the responsibility statisticians have in ensuring reproducibility, accurate reporting, and integrity in both academic and industry settings. Imrey also raised questions about the impact of AI tools on authorship and urged participants to reflect on how new technologies may shape the future of scientific publishing. 

Breakout Discussions and Networking 

The afternoon continued with small group mentor–mentee breakout sessions, where participants shared experiences, career advice, and strategies for handling the peer review process. Technical difficulties initially slowed the assignment of participants to breakout rooms, but these were quickly resolved. 

The day concluded with a casual happy hour networking session, allowing participants to continue conversations, swap contact information, and strengthen professional connections in an informal setting. 

With collaboration, publishing, and ethics at the center of Day Two, participants left with practical strategies for writing, submitting, and reviewing research in statistics and data science. The workshop’s final day will take place at JSM 2025, featuring an in-person short course on ChatGPT and a panel on career development. 

Key Takeaways from Day Two of the NISS Writing Workshop 2025 

Collaborative Writing & Authorship 

  • Statisticians should review all sections of collaborative papers—not just statistical content—to ensure accuracy and appropriate interpretation. 
  • Authorship recognition matters: statisticians making significant contributions should typically be listed as the second author. 
  • Collaborators should include full and transparent descriptions of methods, sample size, and analyses, with clear explanations of exclusions. 
  • Clear communication between collaborators is essential, including negotiation on reporting details, turnaround times, and manuscript content. 

Statistical Rigor & Reproducibility 

  • Statistical methods must be fully described for reproducibility, with supplementary appendices used for technical detail when needed. 
  • Address multiple testing issues explicitly, noting adjustments or clarifying when none are applied. 
  • Analyses should be clearly identified as pre-specified or post hoc, with confidence intervals and p-values reported. 
  • Both absolute and relative effects should be included when reporting binary outcomes. 
  • For clinical trials, papers should include CONSORT checklists, registration numbers, and data-sharing information. 
  • Authors should always reference statistical methods, even familiar ones, to ensure clarity. 

Publishing & Peer Review 

  • Select journals carefully based on novelty, audience, and type of work. Persistence is key—rejections are often about fit, not errors. 
  • Authors should conduct thorough literature reviews, cite recent work, and tailor submissions to journal scope. 
  • Respond to all referee comments tactfully, defending statistical decisions when needed, and consult editors in cases of conflicting reviews. 
  • Editors and journals are encouraged to reward high-quality refereeing and assign papers to appropriate area editors. 
  • Providing code supplements and ensuring reproducibility are becoming essential publishing standards. 

Ethics & Professional Practice 

  • Statisticians must follow ASA ethical guidelines, uphold integrity, and avoid practices like ghost authorship or selective reporting. 
  • Be transparent about the use of AI in research and publications. 
  • Recognize and disclose potential conflicts of interest in both academic and industry collaborations. 
  • Maintain boundaries between research and roles in product marketing or medical communications. 
  • Be prepared to take responsibility for statistical analyses and results in collaborative projects. 
  • Reproducibility and transparency are non-negotiable—research findings must be accurate, verifiable, and ethically reported. 

Mentorship & Professional Growth 

  • Junior faculty and PhD students benefit from mentorship in reviewing papers, with students ideally completing a review with their advisor before graduation. 
  • Peer review is a learning process: uncomfortable experiences with referees can provide valuable lessons for improving future work. 

Sunday, August 10, 2025 by Megan Glenn