Yoichi Ishibashi

Japanese/ηŸ³ζ©‹ι™½δΈ€ he/him


Yoichi Ishibashi is a Researcher at NEC.

My research interests lie in LLMs, Coding Agents, and Self-Evolving LLMs.

For more details, please refer to my CV.

news

May 22, 2025 Our paper Mining Hidden Thoughts from Texts: Evaluating Continual Pretraining with Synthetic Data for LLM Reasoning has been published on arXiv.
Feb 13, 2025 Our paper Can Large Language Models Invent Algorithms to Improve Themselves? has been accepted to NAACL 2025πŸŽ‰
Oct 22, 2024 Our paper Can Large Language Models Invent Algorithms to Improve Themselves? has been published on arXiv.
Mar 14, 2024 Our paper Subspace Representations for Soft Set Operations and Sentence Similarities has been accepted to NAACL 2024πŸŽ‰
Oct 07, 2023 Our paper Knowledge Sanitization of Large Language Models has been published on arXiv.

selected publications

  1. Mining Hidden Thoughts from Texts: Evaluating Continual Pretraining with Synthetic Data for LLM Reasoning
    Yoichi Ishibashi, Taro Yano, and Masafumi Oyamada
    arXiv, 2025
  2. Can Large Language Models Invent Algorithms to Improve Themselves?
    Yoichi Ishibashi, Taro Yano, and Masafumi Oyamada
    NAACL, 2025
  3. Self-Organized Agents: A LLM Multi-Agent Framework toward Ultra Large-Scale Code Generation and Optimization
    Yoichi Ishibashi and Yoshimasa Nishimura
    arXiv, 2024
  4. Knowledge Sanitization of Large Language Models
    Yoichi Ishibashi and Hidetoshi Shimodaira
    arXiv, 2023
  5. Evaluating the Robustness of Discrete Prompts
    Yoichi Ishibashi, Danushka Bollegala, Katsuhito Sudoh, and 1 more author
    EACL, 2023