Ephapsys
Staff Research Engineer, Neural Architectures
San Francisco or Remote
About Ephapsys
We apply ephaptic coupling to pioneer a new paradigm for empowering and governing autonomous AI agents across industries.
As billions of agents spread across clouds, devices, and biological substrates, Ephapsys aims to be the decentralized identity and integrity layer that keeps every agent trusted, safe, and under control.
The Role
We are seeking a Staff Research Engineer, Neural Architectures to lead the advancement of ephaptic coupling technology across modern AI systems.
This role is research-intensive and focused on generating novel, experimentally validated improvements in neural architecture design.
You will work directly with the founder to formalize theoretical models, design controlled experiments, and deliver measurable improvements in performance, computational efficiency, robustness, and adaptability.
The role includes producing publishable research while ensuring practical integration into the Ephapsys SDK and platform.
Responsibilities
Neural Architecture Innovation
- Design and formalize new ephaptic interaction operators within deep learning architectures
- Develop mathematical models describing cross-neuron interaction dynamics
- Integrate ephaptic mechanisms into transformers, MLPs, and hybrid architectures
- Conduct rigorous ablation studies and theoretical analysis of convergence behavior
- Investigate stability properties and dynamical systems implications of modulation layers
Performance & Efficiency Breakthroughs
- Deliver measurable improvements in training efficiency, inference latency, or generalization
- Explore adaptive modulation strategies for improved robustness and transfer learning
- Analyze computational complexity and scalability trade-offs
- Evaluate hardware-aware and distributed training optimizations
Research & Publication
- Produce internal technical reports documenting experimental findings
- Contribute to peer-reviewed publications and conference submissions
- Present research findings in technical forums when appropriate
- Stay current with frontier model research, including transformer alternatives and emerging paradigms
SDK & Platform Integration
- Translate validated research into modular, production-ready SDK components
- Build reusable frontier model modules and research-grade agents
- Establish benchmarking standards to quantify ephaptic performance gains
- Collaborate with security engineering to evaluate new attack surfaces introduced by novel operators
Qualifications
Required
- PhD in Machine Learning, AI, Applied Mathematics, Computational Neuroscience, or a related field
- Strong mathematical foundation (linear algebra, optimization, dynamical systems)
- Deep expertise in neural network theory and architecture design
- Advanced proficiency in PyTorch or TensorFlow
- Experience conducting reproducible research with controlled experimental design
- Demonstrated ability to translate theory into working implementations
Preferred
- Experience training or fine-tuning large-scale models
- Familiarity with distributed training and performance optimization
- Prior publications in reputable AI/ML conferences or journals
- Interest in expanding neural computation beyond synaptic-only paradigms
- Experience contributing to open-source ML frameworks
What Success Looks Like
Within the first 3-6 months, you will have demonstrated experimentally validated performance or efficiency gains from ephaptic integration, contributed to at least one publishable research milestone, and delivered production-ready frontier model capabilities for the Ephapsys platform.
Why Join Ephapsys
- Shape the future of neural architecture research
- Operate at the frontier of adaptive and secure AI systems
- High research autonomy and technical ownership
- Early-stage equity participation
- Direct collaboration on foundational AI innovation
To apply, email jobs@ephapsys.com with your resume and a short note on why you are interested in this role.
