2 PLUGINS
1 WEB APP
Foonts
AI tools designed and developed for designers
goal
Streamlining my workflow with AI:
1. A plugin that converts a frame into grayscale
2. A font randomiser showcasing font pairings from the 1,800+ Google Fonts library
3. A plugin that exports sections and subsections as nested folders
my role
Conceptualised projects, Created PRDs, Linked APIs, Communicated with AI, Fixed bugs
team
Cursor AI (Developer)
ChatGPT (PRD builder) (later kicked out after I discovered a better workflow)
(Organised from most recent to oldest)
> tool 1: Foonts
Inspired by Coolors.co: a font pairing generator for discovering the 1,800+ fonts in the Google Fonts library
Visit the working tool here
> why?
Google Fonts has an incredible library of fonts, but you might not know that.
I love fonts, and I love discovering new, beautiful ones for new projects. But the typical dropdowns, and even the Google Fonts website, don't make scouring through thousands of fonts easy. (a little secret: Figma doesn't show the entire Google Fonts library, either)
I mean, have you heard of:
Bricolage Grotesk?
What about Rethink Sans?
Mona Sans?
Geist?
Didact Gothic?
No?
I wanted to recreate that delight of stumbling upon new, unknown fonts, all in a single tool. How?
Just like this!
Coolors.co is a tool that designers adore. Discovering beautiful colors and discovering beautiful fonts felt like parallel problems, so I decided to give the same idea a spin.
I tested Lovable, Replit and V0 for this. Cursor's final product worked with the fewest errors and loaded all fonts perfectly.
plugged in the Google Fonts API
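Under the hood, this step boils down to two things: pulling the family list from the Google Fonts Developer API, and loading each family on demand so the preview text can actually render in it. Here's a minimal sketch of that idea in TypeScript; the API key constant and helper names are placeholders, not the actual Foonts code.

```typescript
interface GoogleFont {
  family: string;
  category: string;   // e.g. "serif", "sans-serif", "display"
  variants: string[]; // e.g. ["regular", "700", "italic"]
}

const GOOGLE_FONTS_KEY = "YOUR_API_KEY"; // placeholder, not a real key

// Fetch the full font list (1,800+ families) from the Google Fonts Developer API.
async function fetchGoogleFonts(): Promise<GoogleFont[]> {
  const res = await fetch(
    `https://www.googleapis.com/webfonts/v1/webfonts?key=${GOOGLE_FONTS_KEY}&sort=popularity`
  );
  const data = await res.json();
  return data.items as GoogleFont[];
}

// Load one family in the browser by injecting a css2 stylesheet,
// so the preview text can render in that font.
function loadFont(family: string): void {
  const link = document.createElement("link");
  link.rel = "stylesheet";
  link.href = `https://fonts.googleapis.com/css2?family=${family.replace(/ /g, "+")}&display=swap`;
  document.head.appendChild(link);
}
```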
implemented the features
dark mode
Creating dark designs? See how the fonts look in it.
font lock
Like a font? Lock it to randomise only the other font (see the sketch after these features)
weight and size control
All fonts have their own custom size and weight. Change it how you want it.
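To show how font lock and the per-font controls can hang together, here's a rough sketch of the randomiser state in TypeScript. The names (PairingSlot, randomisePairing) are illustrative, not the real implementation.

```typescript
interface PairingSlot {
  family: string;
  locked: boolean; // font lock: a locked slot survives the next shuffle
  weight: number;  // per-font weight control, e.g. 400 or 700
  size: number;    // per-font size control, in px
}

// Re-roll only the unlocked slots; a locked font keeps its family,
// weight and size exactly as the user set them.
function randomisePairing(slots: PairingSlot[], families: string[]): PairingSlot[] {
  return slots.map((slot) =>
    slot.locked
      ? slot
      : { ...slot, family: families[Math.floor(Math.random() * families.length)] }
  );
}
```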
What's next?
fixing the developed ui
implementing font type filters
Multiple features + fixing
> tool 2: Grayscaler
A Figma plugin that converts frames to grayscale
1.1 Grayscaler in action
This was my launchpad to figure out how AI works.
Learning how to communicate with AI took me a week.
Actually making it took me <45 minutes.
In those 45 minutes, this ↓ is how many prompts it took me to get to the final working product.
What did not work (with Cursor AI, at least)
Hefty Product Requirements Documents confused both Cursor AI and me.
As someone without coding experience, I needed to make sure I still had granular awareness + control of what the AI was doing, how the code was working and what was not working.
What worked
A patient, step-by-step process where I implemented new features one by one. Here's how I talked to the chatbot. ↓
Create a feature that grayscales frames with solid fills.
Now, do that for gradient fills.
That worked great. Can you do the same for borders?
Do this for all nested frames too
… and so on, until all asset types were addressed
This allowed me to understand what each part of the code was actually doing, where the errors were happening, and how I could fix them.
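For a sense of what those incremental prompts add up to, here's a simplified sketch of that grayscale pass using the Figma Plugin API: solid fills first, then gradient stops, then strokes (borders), then nested children. The helper names are mine, and this isn't the plugin's exact code.

```typescript
// Rec. 601 luma: collapse an RGB colour to a single gray value.
function toGray(c: RGB): RGB {
  const g = 0.299 * c.r + 0.587 * c.g + 0.114 * c.b;
  return { r: g, g: g, b: g };
}

// Solid fills, gradient stops and strokes all go through the same helper.
function grayscalePaints(paints: readonly Paint[]): Paint[] {
  return paints.map((paint) => {
    if (paint.type === "SOLID") {
      return { ...paint, color: toGray(paint.color) };
    }
    if (paint.type.startsWith("GRADIENT")) {
      const gradient = paint as GradientPaint;
      return {
        ...gradient,
        gradientStops: gradient.gradientStops.map((stop) => ({
          ...stop,
          color: { ...toGray(stop.color), a: stop.color.a },
        })),
      };
    }
    return paint; // image fills etc. left untouched in this sketch
  });
}

// Recurse through a node and all of its nested children.
function grayscaleNode(node: SceneNode): void {
  if ("fills" in node && node.fills !== figma.mixed) {
    node.fills = grayscalePaints(node.fills);
  }
  if ("strokes" in node) {
    node.strokes = grayscalePaints(node.strokes);
  }
  if ("children" in node) {
    node.children.forEach(grayscaleNode);
  }
}

figma.currentPage.selection.forEach(grayscaleNode);
figma.closePlugin("Done");
```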
© 2025 Nandini Vyas. All rights reserved.
Made with, well, AI.