Application Security This Week for February 28

by Bill Sempf 28. February 2021 13:23

Portswigger published their Top 10 Hacking Techniques for 2020.


Vulnerabilities in malware!


Github is doubling down on security tools, which I think is awesome.


Have a great week!


Application Security This Week for February 21

by Bill Sempf 21. February 2021 14:48

Microsoft has some guidance for containers using .NET


Another interesting dependency management tool, but this one is for Python!


AWS isn't the only cloud that has blob storage permission problems.


Have a good week!


Application Security This Week, Valentine's Day edition

by Bill Sempf 14. February 2021 12:45

Apparently I failed to publish last week. Sorry about that.


Rolling shellcode from objects in memory.


The Swiss say they can break encryption using quantum computing.


Remember how everyone has been warning about internet-connected industrial control systems?  Whelp.


Look, more supply chain attacks!

In related news, I'll be speaking on the topic at the Cincinnati Security Users Group on Thursday.


Oh look!  Another one!  We might have a trend here.




Application Security This Week for January 31

by Bill Sempf 31. January 2021 13:26

Using Machine Learning to perfect SQL Injection.

And some practical application of that idea.


Didier has a new PDF tool out.  I haven't used it yet but I am certain it is awesome.


OK, this is a weird one.  It appears that threat actors are using project files with built-in vulnerabilities to target the vulnerability researchers themselves, apparently to steal their research.  That's some next level stuff.


Application Security This Week for January 24th

by Bill Sempf 24. January 2021 12:58

A very interesting list of exploitable "features" in PDFs.


There have been a lot of attacks on Azure's authentication system recently - some of which were even in this newsletter.  Sparrow helps you smoke out vulnerable instances.


Didier has been a regular in this newsletter, and he has updated his tool to support more encoding. Very cool stuff.


Have your kids test your apps.


Stay safe out there.


Application Security This Week for January 17

by Bill Sempf 17. January 2021 12:36

Breakdown of a malicious app that man-in-the-middled the Google Signin.


Good Wired article about tools the fibby uses to get around smartphone encryption.


Oh man, cross-origin images and data leakage.  Certainly adding this to my manual testing.


This has been patched, but a really good explainer on how the RCE in Office 365 was discovered.


Using game hacking to explain the danger of unsigned code.


Have a great week folks!


Application Security This Week for January 10

by Bill Sempf 10. January 2021 13:02

Hey, welcome back from holidays.  Quite a week it has been.


Portswigger has a really good writeup of OAuth 2 vulnerabilities.


This isn't so much appsec, but it is really interesting code that hacks a game - Cyberpunk 2077 minigame resolver.


SolarWinds just keeps on giving.


Keep on keeping on, folks.


Application Security This Week for December 20

by Bill Sempf 20. December 2020 13:40

So, hey, yeah, how are all of you?  Clearly SolarWinds has completely overwhelmed the news this week, so I have a couple of notes about that. To those of you who are having to deal with this, I am with you in spirit. Doing what I can here from The Bunker to help you out.


Here was my first indication there was a problem, I believe.  It's pretty old news now.

I spoke about Supply Chain problems at the Central Ohio .NET Developer's group in March.  Oddly timed.

MicroSolved has a good writeup you should read.

This is Microsoft's breakdown on DLL Injection.  For the record, I attended a BoF session on this at DefCon 15(!) and everyone I talked to blew it off.  Guess not.


Some other news, thank goodness.


Github is gonna ban passwords.


The NSA finally figured out that authentication systems are under attack.


And finally, a short article about memcpy.


That's the news, folks, have a great holiday and end-of-year. May your systems be secure and your code be frozen.



Application Security This Week for December 13

by Bill Sempf 13. December 2020 13:20

There is a potential new addition to DNS security, which is sorely needed.


A good writeup on discovery of a Facebook vulnerability.


I am not in favor of brigading FireEye, and if you are I'll fight you.  That said, the analysis of the stolen tools is very enlightening.


That's the news, folks.  Stay safe.



The Trouble With Teaching Secure Coding

by Bill Sempf 9. December 2020 00:00

Once a week or so, someone calls and asks for OWASP Top 10 testing.  I have to make the call on the spot: do I explain that isn't what they actually want, do I say "Sure!" and then quietly give them what they really need, or do we have a larger-scale meeting to see where their appsec maturity is and base training on that?  Usually it is the third.

The problem is, app security is hard to teach, and frankly many shops need secure coding training, which is even harder.  Let's break down why that is the case.

OWASP training, yes; OWASP Top 10 training, no

OWASP is a great organization.  For those unfamiliar, it is a global nonprofit with the mission of evangelizing application security to developers.  It has its political problems, sure, but in general it solves a very hard problem with grace and clarity.

One of the most famous products to come out of OWASP is the Top 10.  The list ranks the riskiest vulnerability categories reported by member organizations.  It is useful.  Useful for printing out, rolling up, and smacking your CIO with until you get a security budget.

The OWASP Top 10 is not an application security plan.  It is also not a training curriculum.  It is a marketing vehicle, and a remarkably effective one. Use it for that and you are golden.  Try and do an OWASP Top 10 training, and you are performing a disservice.

This discussion doesn't go over well with most.  Everyone wants a magic bullet for application security, but there just isn't one.


The plan is simply to do three things:

1) Teach the developers to recognize security flaws.

2) Teach the developers to repair the security flaws.

3) Give the developers tools to prevent security flaws from ever making it in the code.

Let's count 'em down.

When you need to learn how to test apps

Let's be straight here.  The only way to make applications more secure is to code them securely.  Okey dokey? Good, that's settled.

Now.  There are a few things that need to happen first, and therein lies the rub.  CIOs and Dev Leads want to drop a process in place that will secure their code.  Then I stop by, put '; in their website search field, blow the whole thing up, and get the O face from the team. First, we need to show developers what the attacks are, and how to check for them.

The issue among the high-level development security instructors is that they are so far along in their personal skill set that they wanna talk about in-depth output encoding for style sheets, without realizing that many developers are still wondering what the other site in Cross-Site Scripting even is. I get it. I do.  But we gotta judge that audience, and it's rough. In an average 40-person dev team you are gonna have 7 or 8 people that already know the basics, but not well enough to teach the other thirty-odd.  We need to start there.

Security champions - I love you all very much. Take a look at your dev teams. Close your eyes.  Take a deep breath.  Open your eyes.  Does everyone in there understand JWT risks? Does your organization remove dev names in comments? If not, you need to run an application security testing class.  No, you don't have to have everyone be an uber-hacker. But it is fun, and it does give everyone a starting point.
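On the JWT question in particular, it helps to show people that a token's payload is just base64url, readable by anyone holding it and, if nobody verifies the signature, rewritable. A stdlib-only Python sketch with made-up claims:

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    # JWT segments use unpadded base64url; restore padding before decoding.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def b64url_encode(raw: bytes) -> str:
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

header = b64url_encode(json.dumps({"alg": "none", "typ": "JWT"}).encode())
payload = b64url_encode(json.dumps({"sub": "alice", "admin": False}).encode())
token = f"{header}.{payload}."  # an "alg: none" token carries no signature

# Anyone can read the claims -- and, absent signature checks, rewrite them.
claims = json.loads(b64url_decode(token.split(".")[1]))
print(claims)  # {'sub': 'alice', 'admin': False}
```

If a team is surprised that the payload is not encrypted, or that "alg: none" exists, that's a sign the testing class is needed.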

When you look at code in code review, ask what input validation is being done. Ask about how that viewstate is encoded.  If you get a glassy eyed stare, then consider a class on testing.
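As an illustration of the kind of input validation worth asking about in review, here is an allow-list check in Python; the username rule itself is hypothetical:

```python
import re

# Hypothetical rule: usernames are 3-20 word characters, nothing else.
USERNAME_RE = re.compile(r"^\w{3,20}$")

def validate_username(value: str) -> str:
    # Allow-list validation: reject anything outside the known-good shape,
    # rather than trying to enumerate every dangerous character.
    if not USERNAME_RE.fullmatch(value):
        raise ValueError(f"invalid username: {value!r}")
    return value

print(validate_username("bsempf"))       # accepted
# validate_username("'; DROP TABLE--")   # raises ValueError
```

The point of the code review question isn't the regex; it's whether the developer can say where validation happens at all.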

When you need to learn how to write secure code

Once folks can recognize insecure code, it is time to start fixing things. Sounds far, far easier than it actually is. However, this is when we need to start getting the development staff to build security into their everyday work.

My experience is that you need to do a few things.  First, static analysis.  It isn't perfect, but it is a start.  Static analysis is the process of examining the source code, without running it, to find potential security flaws. Dynamic analysis is the act of looking for flaws in a running application. Either can be automated - meaning a script does the work - or manual - meaning a human does the work.  Automated static analysis, say with a tool like SonarQube, is very likely to generate a ton of false positives at the start, but the rules can be honed over time. It is an imperfect but fairly effective tool.
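To show what "examining the code without running it" looks like, here is a toy Python sketch that walks a source file's syntax tree and flags every call to eval() - including the harmless one, which is exactly the kind of false positive you then hone the rules to suppress. Real tools like SonarQube do this at scale:

```python
import ast

SOURCE = """
user_expr = input()
result = eval(user_expr)   # dangerous: executes arbitrary expressions
safe = eval("1 + 1")       # same rule fires: a likely false positive
"""

# Walk the parsed tree and record the line of every call to a name 'eval'.
# The code in SOURCE is never executed.
findings = []
for node in ast.walk(ast.parse(SOURCE)):
    if (isinstance(node, ast.Call)
            and isinstance(node.func, ast.Name)
            and node.func.id == "eval"):
        findings.append(node.lineno)

print(f"eval() called on lines: {findings}")
```

A real rule engine is vastly more sophisticated, but the trade-off is the same: broad rules catch more flaws and more noise.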

Another important tool that should be used is a secure coding standard.  This is a custom-built document not unlike a style guide.  It is something you can hand to new devs and say "this is how we do things."  Now, this leads well into the next section, about language-agnostic testing and training, because the secure coding document should be tailored to the platform used by your organization.

Testing is language agnostic, but secure coding isn't

The issue, as one discovers writing a secure coding standard, is that testing is very platform agnostic, but writing more secure code is not.  From a tester perspective, I can say "you need to encode your outputs" but from the developer perspective, there is a different way for every language and platform.  Html.Encode()? Sanitize()? Different everywhere, and a few frameworks do the work for you.  
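For instance, in Python's standard library the spelling is html.escape - a tiny sketch of encoding at output time, for the HTML context:

```python
import html

# Untrusted input headed for an HTML page.
comment = '<script>alert("xss")</script>'

# Encode at output time, for the context being written into.
encoded = html.escape(comment)
print(encoded)  # &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

Same principle in every language; only the function name and the framework's defaults change.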

When the report is written, there should be remediation advice, and it should have detailed guidance.  However, that means the tester should have detailed information about the language and platform and framework used to build the tested application.  This is extremely unlikely.   

When trying to teach generally, there needs to be expertise in the language, platform, and framework.  Now, some folks know several languages, platforms, and frameworks, if they have been around a while.  I for instance know C# and ASP.NET on Windows, Java and JSP on Apache, and Python with various frameworks quite well. Others less so.  But I have been doing this a long, long time.  Teaching secure coding in Ruby on Rails requires a specialty in appsec, AND in Ruby.  That's not an everyday set of skills.

So what are we gonna do?

Whatcha gonna do?  It's not the easiest problem to solve. I have a system that I would like to share, though.

First, have someone give a security talk at your company.  Usually, I do a lunch and learn or something, obviously online these days. Go over the vaunted OWASP Top 10, or give a demo of Burp or ZAP. Heck, click F12 and see what you see. I usually invite developers, business analysts, and testers (quality assurance, whatever your term is). Some people will nap through it, some will stay after to ask questions.  Those people that stayed after might very well be your security champions.  

OK, so now we know who is interested.  Second, we do training on testing. Have the security champions help gather the folks they think need to understand the vulnerabilities, and hold a real training - one or two days, with labs - on application security testing.  This gets the core group of people the information they need about the vulnerabilities to look for.  In the labs, have them look for those vulnerabilities.  In their own code.  Encourage folks to test their own dev instances.  Dig in.

Third, retrospective.  Get the champions back together.  What did we learn?  How can we do better?  Most important, what are the secure coding principles that must be taught?  This is where we solve the language agnostic issue.  You can't just call someone in to teach secure coding, you must learn what it means to you and your team and your company.

Fourth, write a secure coding standard. It should be based on the lessons from the retrospective.  Base it on the categories of vulnerabilities, but couched in developer terms.  I use:

  1. Security Principles
  2. Access Control
  3. Content Management
  4. Browser Interaction
  5. Exception Management
  6. Cryptography
  7. System Configuration

But your mileage may vary.  The goal is to build a guide you can give someone on their first day.  Say "We write secure code here.  This is how it is expected to be done."  Think that through.  Usually my documents are 12 pages or less.

Finally, you train the secure coding standard.  Now you know what needs to be trained.  Yes, you have to write the materials, but they can be reused.  It can be as long or as short as you like but you get everyone back together and teach.  Then, as new people join the team, you have the culture in place to hand them the document.

Next, if you want to, you start to enforce the standard with a static analysis process.  That, however, is for another post.



Husband. Father. Pentester. Secure software composer. Brewer. Lockpicker. Ninja. Insurrectionist. Lumberjack. All words that have been used to describe me recently. I help people write more secure software.
