Open to Opportunities

Ranjeeth Burujula

Software Engineer

A work-in-progress engineer with good instincts and a long list of problems I'm excited to tackle.

I'm a Computer Science student at Manipal University aiming to grow into a solid software engineer by working on real systems and real problems—not just code that looks good on a slide. I like building things, occasionally breaking them, and figuring out how they actually work.

I'm happy jumping into different corners of tech—wherever there's something meaningful to build or fix. I learn fast, don't ghost when things get hard, and work well with people who care about what they're building. Messy technical problem on the table? Consider me curious.

10+
Projects
10+
Certificates
3+
Companies Worked With
Featured Projects

Selected Work

Building intelligent systems and scalable solutions with cutting-edge AI/ML technologies

from transformers import BertTokenizer, EncoderDecoderModel
model = EncoderDecoderModel.from_pretrained("bert-base")
outputs = model.generate(input_ids, max_length=128)
bleu = corpus_bleu(refs, hyps) # ~85
NLP • PyTorch • Hugging Face

BERTlingo

English-to-German translation using fine-tuned Transformer models with multi-head attention. Evaluated with BLEU score ~85.

View on GitHub →
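To illustrate the metric BERTlingo is evaluated with, here is a minimal pure-Python sentence-level BLEU (clipped n-gram precision with a brevity penalty). This is a rough sketch of the idea, not the project's actual evaluation code, which uses a library `corpus_bleu`:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def sentence_bleu(reference, hypothesis, max_n=4):
    """BLEU: geometric mean of clipped n-gram precisions times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_counts = Counter(ngrams(hypothesis, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each hypothesis n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        precisions.append(overlap / total if overlap else 1e-9)  # tiny floor avoids log(0)
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Penalize hypotheses shorter than the reference.
    bp = min(1.0, math.exp(1 - len(reference) / len(hypothesis)))
    return bp * geo_mean
```

In practice corpus-level BLEU pools n-gram counts over all test sentences before taking the geometric mean, which is what the snippet above's `corpus_bleu` call reports.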
@hydra.main(config_path="conf")
def train(cfg):
    model = GPT2LMHeadModel.from_pretrained(cfg.model)
    args = TrainingArguments(output_dir=cfg.out_dir, deepspeed=cfg.ds_config)
    trainer = Trainer(model=model, args=args)
LLM Training • PyTorch • DeepSpeed

HF Pipeline

Training pipeline for GPT-2 and BERT using Hydra config management and Weights & Biases tracking across 15+ runs. DeepSpeed multi-GPU support.

View on GitHub →
model = torchvision.models.resnet18(pretrained=True)
model.fc = nn.Linear(512, 10) # CIFAR-10
transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
acc = evaluate(model, test_loader) # 94.2%
Computer Vision • ResNet • CIFAR-10

DeepVision

Image classification on CIFAR-10 using ResNet-18 with data augmentation, normalization, and training visualisation for architecture comparison.

View on GitHub →
func GenerateAPI(schema *Schema) {
    routes := ParseSchema(schema.Tables)
    WriteHandlers(routes, "./output")
    // Cuts dev time by ~70%
}
Go • PostgreSQL • Code Generation

GoInit

CLI tool that auto-generates REST APIs and React UIs from database schemas. Supports PostgreSQL, MySQL, SQLite. Cuts development time by ~70%.

View on GitHub →
const api = new RestApi(this, "FileAPI")
const fn = new Function(this, "Processor")
bucket.grantReadWrite(fn)
table.addGlobalSecondaryIndex(...)
AWS • React • TypeScript • CDK

CloudFileProcessor

Full-stack file processing platform with React UI, AWS CDK backend — API Gateway, Lambda, S3, DynamoDB Streams, and EC2 batch processing.

View on GitHub →
router.post("/users", authenticate, async (req, res) => {
  const user = await User.create(req.body)
  await sendVerificationEmail(user.email)
  res.json(generateJWT(user))
})
Node.js • Express • MongoDB • JWT

NodeRestified

REST API with 12 stateless endpoints, JWT auth, email verification, Gravatar integration, and Jest unit tests in a mock CI/CD pipeline.

View on GitHub →
Skills & Credentials

What I Know


Languages
Python
JavaScript
TypeScript
Go
Java
C / C++
SQL
Bash
R
AI & Machine Learning
PyTorch
TensorFlow
Hugging Face
scikit-learn
OpenCV
LangChain
RAG
Transformers
DeepSpeed
Weights & Biases
Hydra
MLflow
YOLO
Diffusers
Frontend
React
Next.js
Vue.js
HTML5
CSS3
Tailwind CSS
Three.js
GSAP
Backend
Node.js
Express.js
FastAPI
Flask
Django
REST APIs
GraphQL
JWT
OAuth 2.0
WebSockets
Databases
PostgreSQL
MongoDB
Redis
MySQL
Snowflake
DynamoDB
Firebase
Vector DBs
Delta Lake
SQLite
Cloud & DevOps
AWS
Docker
Kubernetes
CI/CD
GitHub Actions
Terraform
Linux
Snowpipe
ETL
Playwright
Jest
Experience

Where I've Worked

Software Engineering Intern
Spur — New York, NY
  • Worked on the AI team to increase the speed of the observability platform, supporting 500+ agent test runs daily.
  • Reduced dashboard data latency by optimising PostgreSQL queries, adding targeted indexes, and implementing Redis caching.
  • Built a Random Forest model in Python to classify failures automatically, cutting weekly log-review time by ~60%.
  • Developed a Dockerized Playwright tool that read AWS S3 logs and replayed recorded failures in a sandbox for faster debugging.
Python • PostgreSQL • Redis • AWS S3 • Docker • Playwright
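The latency work at Spur follows the standard cache-aside pattern: check Redis first, and only hit PostgreSQL on a miss. A minimal sketch of that idea, where a TTL'd dict stands in for the Redis client and the key and TTL values are illustrative:

```python
import json
import time

class CacheAside:
    """In-memory stand-in for Redis GET/SETEX semantics: entries expire after ttl seconds."""

    def __init__(self, ttl=30):
        self.ttl = ttl
        self._store = {}  # key -> (expires_at, JSON payload)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None or entry[0] < time.monotonic():
            return None  # missing or expired
        return json.loads(entry[1])

    def setex(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, json.dumps(value))

def cached_query(cache, key, run_query):
    """Cache-aside read: serve cached rows if fresh, otherwise run the expensive
    query (e.g. the PostgreSQL dashboard query) and cache the result."""
    rows = cache.get(key)
    if rows is None:
        rows = run_query()
        cache.setex(key, rows)
    return rows
```

With a real Redis client the same flow maps onto `GET`/`SETEX`; the win is that repeated dashboard loads within the TTL never touch the database.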
Data Engineering Intern
Cred — Remote, CA
  • Worked on the Data Platform team to remove duplicate ad records from the ETL pipeline that were inflating campaign cost reports.
  • Removed duplicates using Delta Lake, reducing duplication from 30% to under 2% and correcting cost reporting errors.
  • Validated Snowpipe pipelines by testing backfills and monitoring load errors, improving data-quality monitoring and cutting manual correction time.
Delta Lake • Snowflake • Snowpipe • ETL • Data Quality
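Conceptually, the deduplication keeps the most recent record per ad ID — the same effect as a Delta Lake merge keyed on the ad identifier. A plain-Python sketch of that idea (the `ad_id` and `event_ts` field names are illustrative; the real pipeline runs on Spark and Delta Lake):

```python
def dedupe_latest(records, key="ad_id", ts="event_ts"):
    """Keep only the most recent record per key, dropping older duplicates.

    Mirrors a keyed upsert: later events for the same ad replace earlier ones,
    so each ad is counted once in downstream cost reports.
    """
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return list(latest.values())
```

On Delta Lake the equivalent is a `MERGE INTO ... WHEN MATCHED` keyed on the ad ID (or `dropDuplicates` on the incremental batch), which is what drives the duplication rate from 30% to under 2%.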
AI & ML Engineer • Computer Vision Specialist • Full Stack Developer