This tutorial examines how ChatGPT can assist journal editors in improving the efficiency and effectiveness of academic publishing. It highlights ChatGPT’s key characteristics, focusing on the use of “Custom instructions” to generate tailored responses and plugin integration for accessing up-to-date information. The tutorial presents practical advice and illustrative examples to demonstrate how editors can adeptly employ these features to improve their work practices. It covers the intricacies of developing advanced prompts and the application of zero-shot and few-shot prompting techniques across a range of editorial tasks, including literature reviews, training novice reviewers, and improving language quality. Furthermore, the tutorial addresses potential challenges inherent in using ChatGPT, including limited precision, insensitivity to cultural nuances, embedded biases, and a restricted vocabulary in specialized fields. The tutorial concludes by advocating for an integrated approach that combines ChatGPT’s technological capabilities with the critical insight of human editors, emphasizing that ChatGPT should be recognized not as a replacement for human judgment and expertise in editorial processes, but as a supportive and complementary tool.
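As an illustration of the few-shot prompting technique mentioned above, the sketch below shows how an editor might request language editing programmatically. The tutorial itself concerns the ChatGPT web interface, but the same few-shot pattern can be expressed through the OpenAI Python client, which this minimal sketch assumes is installed and configured with an API key; the model name, example sentences, and prompt wording are illustrative and are not taken from the tutorial.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

few_shot_messages = [
    {"role": "system", "content": "You are a manuscript language editor for a scholarly journal."},
    # Few-shot example: an awkward sentence paired with its corrected revision.
    {"role": "user", "content": "Revise: The results was significant statistically."},
    {"role": "assistant", "content": "The results were statistically significant."},
    # The sentence the editor actually wants revised.
    {"role": "user", "content": "Revise: These datas suggests a strong correlation."},
]

response = client.chat.completions.create(model="gpt-4o", messages=few_shot_messages)
print(response.choices[0].message.content)

The example pair preceding the final request is what makes this few-shot rather than zero-shot prompting; omitting it would yield a zero-shot prompt.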
Purpose This study aimed to ascertain the attitudes of Korean scholarly journal editors and publishers toward research data sharing policies and the publication of data papers through a survey.
Methods Between May 16 and June 16, 2023, a SurveyMonkey survey link was distributed to 388 societies, including 270 member societies of the Korean Council of Science Editors and 118 societies that used an e-submission system operated by the Korea Institute of Science and Technology Information. A total of 78 societies (20.1%) responded, from which 72 responses (18.6%) were analyzed after excluding invalid responses.
Results Of the representatives of the 72 journals, 20 editors or publishers (27.8%) reported having a data sharing policy. Journals without such a policy often expressed uncertainty about their future plans regarding this issue. A common concern was a potential decrease in manuscript submissions, along with the increased workload such a policy might impose on editors and manuscript editors. Four respondents (5.6%) had published data papers, two of whom included this as a publication type in their author guidelines. Concerns about copyright and data licensing were cited as drawbacks to publishing data papers, whereas the expansion of publication types and the promotion of data reuse were viewed as benefits.
Conclusion Korean scholarly journal editors’ and publishers’ attitudes toward data sharing policy and publishing data papers are not yet favorable. More training courses are needed to raise awareness of data sharing platforms and emphasize the need for research data sharing and data papers.
Purpose This study aimed to develop a decision-support tool to quantitatively determine authorship in clinical trial publications.
Methods The tool was developed in three phases: consolidating authorship recommendations from the Good Publication Practice (GPP) and International Committee of Medical Journal Editors (ICMJE) guidelines, identifying and scoring attributes using a 5-point Likert scale or a dichotomous scale, and soliciting feedback from editors and researchers.
Results The authorship criteria stipulated by the ICMJE and GPP recommendations were categorized into two modules. Criterion 1 and the related GPP recommendations formed Module 1 (sub-criteria: contribution to design, data generation, and interpretation), while Module 2 was based on criteria 2 to 4 and the related GPP recommendations (sub-criteria: contribution to manuscript preparation and approval). The two modules with their sub-criteria were then differentiated into attributes (n = 17 in Module 1, n = 12 in Module 2). An individual contributor is scored for each sub-criterion by summing the related attribute values, and the sum of the sub-criterion scores constitutes the module score (maximum Module 1 score: 70 [contribution to conception or design of the study, 20; data acquisition, 7; data analysis, 27; interpretation of data, 16]; maximum Module 2 score: 50 [content development, 27; content review, 18; accountability, 5]). The concept was implemented in Microsoft Excel with appropriate formulas and macros. A predefined threshold of 50% for each sub-criterion and each module, together with an overall score of 65%, qualifies a contributor for authorship.
Conclusion This authorship decision-support tool would help clinical trial sponsors assess contributions and assign authorship to deserving contributors.
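To make the scoring logic of the decision-support tool concrete, the following is a minimal sketch in Python. It assumes that each contributor’s sub-criterion scores have already been obtained by summing the related attribute values; the attribute level is omitted, and the function and dictionary names are illustrative rather than taken from the published Excel tool.

# Maximum points per sub-criterion, as reported in the Results above.
MAX_SCORES = {
    "Module 1": {  # contribution to design, data generation, and interpretation (max 70)
        "conception_or_design": 20,
        "data_acquisition": 7,
        "data_analysis": 27,
        "data_interpretation": 16,
    },
    "Module 2": {  # contribution to manuscript preparation and approval (max 50)
        "content_development": 27,
        "content_review": 18,
        "accountability": 5,
    },
}

SUBCRITERION_THRESHOLD = 0.50  # each sub-criterion must reach 50% of its maximum
MODULE_THRESHOLD = 0.50        # each module must reach 50% of its maximum
OVERALL_THRESHOLD = 0.65       # the overall score must reach 65% of the overall maximum

def qualifies_for_authorship(scores):
    """Return True if a contributor's scores meet every predefined threshold."""
    overall_score = 0
    overall_max = 0
    for module, sub_maxima in MAX_SCORES.items():
        module_score = 0
        module_max = sum(sub_maxima.values())
        for sub, sub_max in sub_maxima.items():
            sub_score = scores[module][sub]
            if sub_score < SUBCRITERION_THRESHOLD * sub_max:
                return False  # fails the 50% sub-criterion threshold
            module_score += sub_score
        if module_score < MODULE_THRESHOLD * module_max:
            return False  # fails the 50% module threshold
        overall_score += module_score
        overall_max += module_max
    return overall_score >= OVERALL_THRESHOLD * overall_max  # 65% overall threshold

# Hypothetical contributor who meets every threshold.
contributor = {
    "Module 1": {"conception_or_design": 15, "data_acquisition": 4,
                 "data_analysis": 20, "data_interpretation": 10},
    "Module 2": {"content_development": 20, "content_review": 12,
                 "accountability": 3},
}
print(qualifies_for_authorship(contributor))  # True

With the maxima above, the overall maximum is 120 points, so the 65% overall threshold corresponds to 78 points.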
With the objective of improving the quality of Korean journals and elevating them to international standards, the National Research Foundation of Korea, in consultation with Elsevier, formed the Scopus Expert Content Selection and Advisory Committee-Korea (ECSAC-Korea) as a local selection committee in August 2012. The committee reviews Korean journals for Scopus indexing and recommends them to the Scopus Content Selection and Advisory Board. In September 2019, ECSAC-Korea became part of the Korean Council of Science Editors (KCSE). This article describes the current status of Scopus indexing in Korea and the history, organizational structure, and role of ECSAC-Korea as part of the KCSE. The article also introduces the members of ECSAC-Korea and the KCSE steering committee for Scopus ECSAC-Korea, who have been active since September 2019.
For editors and manuscript editors, the romanization of Korean characters is a topic that should be understood thoroughly, because Korean proper nouns have become more widely used worldwide due to phenomena such as Hallyu (the Korean wave). In this report, I describe the 2 major romanization systems used in Korea: the Korean government’s romanization system and the McCune-Reischauer system. I also describe the transliteration guidelines presented in a variety of reference styles, such as the CSE (Council of Science Editors), ACS (American Chemical Society), AMA (American Medical Association), APA (American Psychological Association), IEEE (Institute of Electrical and Electronics Engineers) styles and the NLM (National Library of Medicine) style guide. I found that 2 journals have adopted the Korean government’s romanization system, while 10 use the McCune-Reischauer system. Other journals do not specifically mention a romanization system. Editors should select a romanization system and use it consistently. When presenting a reference that includes romanized text, the journal’s house style should be followed, based on international reference citation styles. Chinese characters in documents published in Korea should be romanized according to the Korean pronunciation.
This is a republication of Appendix 1, The Golden Rules and the Peer-Review Good Practice Checklist, from the author’s book, Peer Review and Manuscript Management in Scientific Journals: guidelines for good practice, published in 2007 by Wiley-Blackwell in association with ALPSP (the Association of Learned and Professional Society Publishers), with the permission of the author and publisher (ISBN: 978-1-4051-3159-9, http://eu.wiley.com/WileyCDA/WileyTitle/productCd-1405131594.html).