March 9th, 2024

New

Introducing Beta Release!

We've been working rigorously behind the scenes for the past six months, and today we are thrilled to unveil the cveradar beta: a cyber tool designed to empower your day-to-day work!

πŸ¦‡πŸ“‘ What's with the hype and what was the problem? πŸ”₯πŸͺ²

If you work in cybersecurity, you are probably already aware that the number of vulnerabilities is on an uphill trend and is becoming increasingly difficult to manage. Security teams and practitioners generally rely on either off-the-shelf solutions or internal tools to reduce the noise and focus their efforts on vulnerabilities that actually pose a threat. Various vendors provide diverse solutions to surface relevant vulnerabilities, but there has always been a missing 🧩 during the prioritization phase...

And that's where we come in with our secret sauce! 🥫🍅

We have built a proprietary solution, custom-tailored to assess vulnerability popularity (relative to other vulnerabilities) based on global web activity.

4️⃣♻️ --> fingerprinting occurs twice a day (every 12h), and based on the latest fingerprint results, we run an additional, smaller fingerprint for the Top 50 (top 50 threat + top 50 social).

#1 large -@12h

#2 small -@6h

#3 large -@12h

#4 small -@6h

🌏🌎 --> fingerprinting occurs on multiple regions and pokes unlimited sources for relevant signals. We aim to provide a "global average". This also helps keeping the false positive rate lower.

🏎️🏁 --> fingerprinting is blazingly fast. The faster it is, the more accurate the results.

πŸ•œπŸ’Ύ --> fingerprinting returns results within a specified timeframe. That's how we are able to provide relevant statistics and calculate averages.


We built the solution from the ground up, and even though we encountered multiple issues during the design and implementation phases, we are now happy with how things stand and have decided to open the gates.

πŸ†˜ How does this help you? 🀝

The utmost value cveradar brings to the table: staying on top of security incidents. Instead of manually hunting for popular vulnerabilities across specialized technical blogs or platforms, we now do this for you, automatically, based on unlimited sources. Feeling as clueless as a goldfish in the dark about where to begin your vulnerability prioritization? Time to upgrade from dial-up confusion to broadband clarity!

Log in...check the aggregated views...make data-driven decisions 📈✅

🍎One more thing! ☒️

While designing and implementing the product, we realized something... Social activity data was indeed a missing puzzle piece, but it is not the single most important criterion for vulnerability prioritization; it carries the same weight as the other criteria.

Thanks to the amazing people @metasploit.com, @cisa.gov, @exploit-db.com, @trickest.com, @first.org, @nvd.nist.gov, @cve.org who decided to open-source valuable vulnerability data, we were able to create a custom formula that outputs threat points (not to be confused with risk) and provides holistic insights.

round(social_percentile)
+ max(nvd_score) * 10
+ (100 if PoC else 0)
+ (100 if Automated Exploit else 0)
+ (100 if Exploited in the wild else round(epss_percentile))


While this formula may not be in its final form, we believe it effectively serves its purpose. Each parameter is designed not to exceed 100 points, ensuring equal weight for each. We hope this will serve as an extrinsic (non-environmental) baseline for prioritization, allowing users to layer additional intrinsic (environmental) criteria on top.
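Under the assumption that both percentiles are already on a 0-100 scale and that a CVE can carry several NVD (CVSS) base scores in the 0-10 range, the formula can be sketched in Python; the function and parameter names are ours, not cveradar's:

```python
def threat_points(social_percentile, nvd_scores, has_poc,
                  has_automated_exploit, exploited_in_wild, epss_percentile):
    """Sketch of the threat-point formula above; each term caps at 100."""
    points = round(social_percentile)              # social activity, 0-100
    points += max(nvd_scores) * 10                 # worst CVSS score, 0-100
    points += 100 if has_poc else 0                # public PoC available
    points += 100 if has_automated_exploit else 0  # automated exploit exists
    # confirmed exploitation in the wild overrides the EPSS percentile
    points += 100 if exploited_in_wild else round(epss_percentile)
    return points

# e.g. a CVE at the 80th social percentile, CVSS scores 7.5 and 9.8,
# with a PoC, no automated exploit, and in-the-wild exploitation:
print(threat_points(80, [7.5, 9.8], True, False, True, 42))  # 378.0
```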


🌟 What are the most important features? πŸ†

  1. Spot trending vulnerabilities as well as vulnerabilities posing elevated threat

  2. Configure alerts based on custom parameters

  3. Historical data, fresh data, and accurate-ish data

  4. Intuitive card-based UI. Toggle between social/threat, date picker, detailed view

  5. Roadmap is driven by the community

⚠️ Acknowledging the limitations 🚧

The product isn't perfect...yet.

We only support CVE-2024-XXX at this point. This introduces a partial gap in the data, as, for example, a CVE-2023-xxx might still be published in the 2024 calendar year. Using the CVE published_date instead might be a better approach; however, we first need to understand the performance hit.
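A toy illustration of that gap, assuming hypothetical record fields ("id", "published_date") rather than cveradar's actual schema:

```python
def in_scope_by_id(cve):
    # current approach: filter on the year embedded in the CVE identifier
    return cve["id"].startswith("CVE-2024-")

def in_scope_by_date(cve):
    # possible alternative: filter on the publication date instead
    return cve["published_date"].startswith("2024")

# A 2023-numbered CVE published in 2024 slips past the ID-based filter:
late_cve = {"id": "CVE-2023-99999", "published_date": "2024-01-15"}
print(in_scope_by_id(late_cve), in_scope_by_date(late_cve))  # False True
```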

Social activity data is not 100% accurate and probably never will be. This is because we cannot control the accuracy of the source data (at least at this point). More data, better accuracy. Utilizing the aggregated views will help flatten any abnormal spikes or gaps in the data.

*Data from January and February is less accurate as we applied multiple tweaks to the fingerprinting technique along the way.

We only provide data for cve_status != reserved, as including reserved CVEs would put unnecessary strain on performance.

As more and more users find the product useful, we might run into some SQL performance issues.

Alerts do not work with Top 50 data (they only work on the data collected by the large fingerprints).

πŸ”­ Whats on the horizon? πŸͺ

For the upcoming months we will be focusing on:

  1. improving data accuracy 🎯

  2. improving scalability 🛠️

  3. listening to feedback 🗣️


πŸ’‘We already have some ideas for the longer term. Dive deeper into cve statistics (e.g. how many cves with X status, how many cves with X cwe?), making the data available via API, cve hall of fame, technical blog posts for top cves, additional sources or pulling data from specific APIs such as X, shodan integration, and more...

Let's validate the product and understand whether it brings any value to your day-to-day activity. Does it save you some headache?

We are thrilled to have reached this moment and are eager to hear your thoughts on the initial set of features. Do share with us any feedback you might consider relevant or any bugs you encounter!



Built with 💛 by a team of 2 🚀

πŸ™ Credit πŸ‘

Data sources:

Web themes: