Tuesday, May 23, 2017

Cloud computing, agile software development and DevOps are a holy trinity

Agile methodologies, DevOps and Cloud technologies are sometimes referred to as a holy trinity of the modern software development world. One of the promises of the Agile approach was to make software release cycles shorter. Gone are the days when software firms could ship a major release every two years (with perhaps one or two service packs in between). Competitive pressure requires a much higher delivery frequency. Product owners started adopting hypothesis-driven approaches to quickly validate assumptions and close feedback loops.

To support fast-paced environments, organizations needed to reduce friction between different teams as much as possible. Around 2009, DevOps was born. It is a way of thinking - a philosophy, in a sense - that brings different teams together into a cohesive unit focused on end-to-end delivery. By eliminating manual steps and automating as much as possible, DevOps has shifted the focus from the traditional questions of stability and efficiency towards agility and unlocking teams' innovation potential.

To be successful in innovation, organizations have to rely on the creative thinking and engineering talent of their staff as the key inputs. Humans are good at thinking. Engineers enjoy solving new problems. All mundane, repetitive tasks can and should be automated. These tasks are still a necessary part of the overall software development cycle, but it has become obvious that human talent is wasted performing them. Ask QA engineers how they feel about running manual regression tests. Automating repetitive tasks is one of the core DevOps deliverables. This way organizations gain pace, reduce the risk of human error and allow their engineers to focus on the important things.

A/B testing, hypothesis validation, faster feedback loops… They all require quick provisioning and elasticity. In the traditional IT world, it could take weeks to provision a new server. Software delivery teams couldn’t and didn’t want to wait that long. And, what if we need this server only for a couple of weeks to quickly validate an idea?!

This is how the cloud was born. Public cloud offerings are becoming mainstream. More and more companies find it tempting to rent compute resources (OPEX) instead of investing in their own hardware and data centers (CAPEX).

A public cloud model is perfect for a start-up. More mature businesses with an existing hardware footprint quite often choose the hybrid model in order to protect their existing investment. This model offers the best of both worlds. Legacy applications continue running in the existing datacenters (private cloud), with the ability to overflow into the public cloud. The public cloud's elasticity can provide immediate access to the required resources where it makes sense.

There might be several reasons why a company might decide to go down this path. It could be a hard-to-maintain legacy application that would be nearly impossible to migrate. Or they might be unwilling to move sensitive data or mission-critical applications, as the risk of doing so might outweigh the perceived benefits of the public cloud. Private cloud or datacenter infrastructure can continue running static workloads (the ones that don't require the adaptive capabilities of the public cloud), providing additional security benefits or meeting regulatory requirements where sensitive data cannot be stored in certain geographies.

Modern software development practices push us to break legacy monoliths into independent microservices. This is where some of the newly written code can easily be moved to the cloud.

It is a mistake to think that DevOps is just "IT for the Cloud". DevOps can truly shine for companies considering the hybrid cloud approach.

What are the key DevOps considerations when running in a hybrid cloud environment?


Define your technology stack and stick to it


Avoid creating a "zoo of different technologies". I often use this term to describe a situation where different teams select similar but different technologies (e.g. MSMQ, SQS and Redis as queuing mechanisms) that provide roughly the same functionality. Identify the key "building blocks" your application(s) consist of and socialize that list with everyone on the team. For example, if the front-end team decided to use Angular, then this should become your weapon of choice for some time. New JavaScript frameworks get released every six months; it would be a mistake to chase "the latest and greatest" flavor of the day.

The "sticking to it" part doesn't mean stagnation. But every proposed change should be justified: what will the new technology give us compared to what we already have? If you decide to go ahead, then aim to replace the existing technology, not to run the new one in parallel. A short overlapping period is fine, but running two similar technologies for a long time is what creates a hard-to-support "zoo" and eventually results in increased technical debt.

Seek feature parity between the on-prem and cloud-based deployments


When running in hybrid mode it might be tempting to choose cloud-native technologies in the cloud part of your infrastructure - for example SQS for simple queuing. But in most cases this would be a mistake. You will break the parity principle, meaning that you now have to support two different technologies. The two parts won't look the same, you will have to deploy them differently, and testing will most likely differ too. Consequently, it is better to select technologies that can be deployed into both parts of the hybrid solution.

Abstract the underlying components and infrastructure from your application


If the temptation to use cloud-native technologies is still too strong, then at least make an effort to architecturally abstract these pieces into a separate layer. Avoid making direct calls to these components from all around the application. Instead, implement a thin abstraction layer as a single entry point and provide this functionality as an easy-to-consume library/module. By doing this you will make it easier to upgrade or replace the underlying technology later on. This will also reduce your reliance on a particular cloud provider ("vendor lock-in"), making your overall solution more portable.

Avoid manual changes at all costs


Consistency is key to operating large, complex, fast-paced environments at scale. You can't afford to have unique snowflakes in your environment. The "cattle, not pets" mantra means all of our individual assets should look the same. Any changes to the setup or configuration should be made via the orchestration layer. When troubleshooting production issues, it might sometimes be tempting to log in to a server and quickly make changes manually. This is driven by a false sense of convenience. If making changes through automation sounds slow or inconvenient, then it's a sign that your automation/orchestration toolset needs to be improved. This leads us to the last point.

Select the right automation platform


It is important to select the right tools to automate and manage your infrastructure. These tools should be able to control both the on-prem and the cloud parts, providing a seamless "single pane of glass" view of your infrastructure. They should speak the "native language" of the cloud platform and be convenient enough for the DevOps team to use them as the primary (and only!) way of maintaining the environment.

Keywords: Agile, DevOps, Cloud, "hybrid cloud", "public cloud"

Sunday, April 9, 2017

The mysterious "J" instead of a smiley face

I am sure many of you have received emails with a "mysterious" capital "J" character in sentences like "Thank you J". And I am sure many of you know the answer. But just in case you are still wondering what it means, here is the explanation.

You may have figured out from the context that "J" stands for a smiley face. And you would be right. But why "J"? After all, it doesn't really resemble a smiley face.

What does J mean in the email messages?

I will use a short email that I received earlier this week as an example:

It looks absolutely fine in my Outlook 2016. But in a different mail client the smiley face is replaced with a capital "J". Let's look at the actual HTML code of this email:

<body bgcolor=white lang=EN-US link="#0563C1" vlink="#954F72"><div class=WordSection1><p class=MsoNormal>Team effort! Strength in numbers <span style='font-family:Wingdings'>J</span> <o:p></o:p></p><p class=MsoNormal><o:p>&nbsp;</o:p></p>
Interesting! Where Outlook renders a smiley face we actually have a span tag that switches to the Wingdings font. And that span contains a single capital "J" character.

<span style='font-family:Wingdings'>J</span>
To understand what's actually happening here we need to launch the Character Map application (charmap.exe) built into every version of Windows.

Let's select the Wingdings font there and type a capital "J".


Wingdings font - capital "J"

See the smiley face there? Now it all makes sense. It's Outlook's way of embedding a smiley face into the email body. I've seen this behaviour for quite some time, so the decision was most likely made before the wider adoption of Unicode. Email clients that support both HTML and the Wingdings font will render the smiley face properly. But sometimes HTML tags are stripped out, the encoding changes from server to server, or Wingdings is not supported - in those cases the end result is a plain capital "J".

Note: this is not the same smiley face as the Unicode character U+263A (decimal 9786), which looks like this: ☺

Unicode character U+263A - smiley face

So, now you know what this mysterious "J" means and where it comes from. Hope you've enjoyed this quick investigation. 

Monday, February 20, 2017

Australian Banks Security (HTTP headers edition) - Feb 2017

Back in 2015 I wrote two blog posts examining the security posture of the major Australian banks. I focused on only two aspects - HTTP security headers (the presence or absence of particular headers) and the login forms (password lengths, autocomplete etc). On one hand, this is not in-depth research and it is certainly not a vulnerability assessment of the kind I am sure all these banks regularly go through. On the other hand, it is a great indication of whether a bank's development and security teams follow modern security practices and put enough effort into their online security. This may serve as an indirect indication of the overall state of security in a given organisation.

I was curious to see if there were any changes (for better or for worse) during these last 2 years. HTTP security headers have really become mainstream and I expected the adoption rates to be higher.

TL;DR

Image source: http://blog.kulshitsky.com

Scott Helme has continued to evolve his great Security Headers web site, which I used during my previous analysis. Similar to the Qualys SSL Server Test tool, he has added an overall rating, which I will include as a new column. Another nice addition is a new check for the Referrer-Policy header. If you haven't done it yet, make sure you go to Scott's securityheaders.io site to check the HTTP headers emitted by your web site. Let me know if you need any help understanding or addressing any of the highlighted issues.



Let's see what Australian banks do with regard to HTTP security headers in February 2017.

Results


The number in brackets next to the score is the change since the 2015 review. HSTS = Strict-Transport-Security, CSP = Content-Security-Policy, HPKP = Public-Key-Pins.

Bank | Score | SecurityHeaders rating | HSTS | CSP | HPKP | X-Frame-Options | X-Xss-Protection | X-Content-Type-Options | Server | X-Powered-By | X-AspNet-Version
IMB | 5 (+5.5) | C | Yes | No | No | Yes, DENY | Yes | Yes | No | No | No
Bank West | 4 (-1) | C | Yes | Yes | No | Yes, SAMEORIGIN | Yes | No | No | Yes | No
Beyond | 3 (+3) | E | Yes | No | No | Yes, SAMEORIGIN | No | No | No | No | No
ING Direct | 3 (+2) | E | Yes | No | No | Yes, SAMEORIGIN | No | No | No | No | No
St George | 3 (+2.5) | E | Yes | No | No | Yes, SAMEORIGIN | No | No | No | No | No
Bendigo Bank | 2 (+2) | E | No | No | No | Yes, SAMEORIGIN | No | Yes | No | No | No
Teachers Mutual | 2 (+4) | E | Yes | No | No | No | No | No | No | No | No
CUA | 1.5 | E | Yes | No | No | No | No | No | Yes, CUA Server | No | No
Commonwealth Bank | 1 | E | Yes | No | No | Present, but incorrect ALLOW-FROM syntax | No | No | Yes, Apache/2.4.6 (Red Hat) OpenSSL 1.0.1e-fips | No | No
Newcastle Permanent | 1 (+2) | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
People's Choice Credit Union | 1 (+1) | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
P&N | 1 (+3) | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
Suncorp | 1 (+1) | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
Westpac | 1 (+1.5) | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
AMP | 0.5 | F | No | No | No | Yes, SAMEORIGIN | No | No | Yes, IBM_HTTP_Server | No | No
ANZ | 0.5 (+2.5) | F | No | No | No | Yes, SAMEORIGIN | No | No | Yes, Apache | No | No
Bankmecu -> BankAust | 0 | F | No | No | No | No | No | No | No | No | No
Greater | 0 | F | No | No | No | No | No | No | No | No | No
Heritage | 0 | F | No | No | No | No | No | No | No | No | No
Macquarie | 0 | F | No | No | No | No | No | No | No | No | No
Bank of Queensland | -2 | F | No | No | No | No | No | No | No | Yes, ASP.NET | Yes, 2.0.50727


Key findings

  • Significant improvements over the last 2 years
    • Only 1 bank is in the negative territory (previously 7)
    • 7 banks have a score of 2 or above (previously only 1)
  • Better adoption of security headers (group 1) by the banks.
    • X-Frame-Options is the most popular header. 13 out of 21 banks (62%) have adopted it (previously only 4). I guess more security professionals now recognise clickjacking as a real weakness.
    • Great to see 8 banks out of 21 (38%) using HSTS (previously only 2)
    • But not everyone who emits the HSTS header includes subdomains (includeSubDomains)
    • And even fewer banks use the "preload" directive (a required step for HSTS preloading) - only CBA does. See the example header after this list.
  • Content-Security-Policy is still not getting any traction. Only one bank - Bank West - has implemented CSP. CSP is a powerful defence-in-depth measure against cross-site scripting, clickjacking and some other types of attacks.
  • The situation with the group 2 headers is even better. Many banks that were in the second half of the table lifted their game and removed these unnecessary headers. Only 6 banks out of 21 still need to fix this issue (previously 12).
  • There is still a long way to go.
    • No one uses Public-Key-Pins
    • Only 2 banks serve the X-Xss-Protection header. This is the simplest and essentially zero-risk header to implement!
    • Only 2 banks use the X-Content-Type-Options header (previously none). This is another extremely simple header to implement.
    • Understandably, no one uses the Referrer-Policy header yet.
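
For reference, a fully featured HSTS response header (the max-age value below is illustrative) looks like this:

Strict-Transport-Security: max-age=31536000; includeSubDomains; preload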

Additional comments

BankAust redirects from home page to a non-secure page. Why? Please fix this.

CBA made a mistake in the X-Frame-Options ALLOW-FROM syntax. There is no need for an equals sign there.
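
The correct form uses a space, not an equals sign, between the directive and the URI (the origin below is illustrative):

X-Frame-Options: ALLOW-FROM https://www.example.com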

The previous winner, Bank West, was the only bank whose score dropped. One point was deducted for the presence of the X-Powered-By header. It's an easy mistake to make - the header usually "returns" after a .NET patch installation.

We have a new leader. Congratulations to the IMB bank. They made a massive jump (+5.5 points) fixing all of the issues and introducing many of the recommended HTTP security headers. Well done!


Sunday, February 12, 2017

Obscure Windows commands and features

My previous blog post has become quite popular, crossing the 10,000-view mark in just a few days. Given such interest, I decided to share a few more useful commands as well as some obscure tricks that I have come across over the years.

hh.exe - HTML help

hh.exe has been part of Windows for a very long time. What makes it fun though is that it supports external URLs, so we can make calls like this:

hh http:\\google.com    (notice the backslashes)




This made me curious - what is the user agent string for this "browser"? Running it on a Windows 7 machine:

hh http:\\whatismyuseragent.net


Let's just collectively exhale "Wow... IE7" ;)

But what about Windows 10 I hear you ask. There it is:


Same thing - IE7. Of course this is not the real IE7 - Trident/7.0 is a giveaway. This is IE11 running in IE7 compatibility mode. But I still find it funny.

mshta.exe exhibits the same behavior except for the old style navigation bar. But I find the retro "IE6 style buttons" look way more amusing.

I mentioned another effect of HH in one of my earlier blog posts - even if you disable Adobe Flash player in your browsers, it will still be there (as demonstrated by opening a web page in HH).

"God mode"

You can create a directory with a specific GUID extension to enable the so-called God Mode. No, you won't get the BFG 9000. Instead, Windows will populate this directory with a LOT of tools and various management options - all in one spot. There will be nothing new that you haven't seen before - just an interesting way of presenting all of these tools in the same place.


To do this - just create a directory with this specific name ("GodMode" part can be anything but the GUID part is important)

md "GodMode.{ED7BA470-8E54-465E-825C-99712043E01C}"


Yet another Recycle Bin?


In case one is not enough ;)

md YetAnotherRecycleBin.{645FF040-5081-101B-9F08-00AA002F954E}


Wherever you create this directory, it will act as a Recycle Bin. It is in fact the existing Recycle Bin - i.e. if you already had a few deleted files, you will be able to see them in the "new" bin too.

View Reliability History

Buried deep inside Windows, the Reliability Monitor gives you a quick look at what has happened to your system recently that may have had an impact on its reliability.

To find it - just click Start -> Run -> and type "reliability"
Or you can choose a more convoluted path: go to Control Panel -> System and Security -> Review your computer's status and resolve issues -> Maintenance


and then click on "View reliability history" (at the bottom of this picture)



This is probably something that you have never used before. And yet it certainly is a useful representation of what happened on this system recently.
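
There is also a quicker way - on Windows 7 and later you can launch the Reliability Monitor directly from the Run dialog or a command prompt:

perfmon /rel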

Notepad Log

Did you know that Notepad can append a date/time stamp every time you open a document? This feature has been around for a very long time, but not many people know about it.

Just create a new text document and put ".LOG" (without the quotation marks) at the top.



Now every time you open this file, you will see a new timestamp automatically added by Notepad.

Stored Credentials (aka Windows Vault)

Run this command:

control keymgr.dll


You will be able to see all currently stored credentials (and perhaps delete the unused or sensitive ones)

You can use a command line equivalent:

cmdkey /list





When was the computer rebooted last time?


Run this command:

net statistics server 

and check the "Statistics since <date time>" line
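
Alternatively, on English-language systems the systeminfo command reports the last boot time directly:

systeminfo | find "Boot Time"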

Find which application will open files with a particular file extension

I know that usually people just search through the registry to get this information. But there is a simpler way, using two commands. First, the ASSOC command finds the file type currently associated with a particular file extension. Then, knowing the file type, we feed it into the FTYPE command to display which application handles that association. For example, let's find out which application will open PDF files:

assoc | find "pdf"
ftype AcroExch.Document.2015
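
The two steps can also be combined into a single line at an interactive prompt (the file type name - AcroExch.Document.2015 in this example - will differ between systems):

for /f "tokens=2 delims==" %A in ('assoc .pdf') do @ftype %A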




Find all CAs (certificate authorities) in your organisation (Active Directory)


certutil has rich functionality. Here I am going to show you how to find all certificate authorities in your organisation:

certutil -ADCA | find "dNSHostName"


Get a list of all domain controllers


nltest is another powerful utility with lots of useful options. E.g. this is how you can quickly find a list of all domain controllers:

nltest /DCLIST:YourOrgDomainName


How to wipe deleted data using the cipher utility?


I will quote this Microsoft KB article:

When you delete files or folders, the data is not initially removed from the hard disk. Instead, the space on the disk that was occupied by the deleted data is "deallocated." After it is deallocated, the space is available for use when new data is written to the disk. Until the space is overwritten, you can recover the deleted data by using a low-level disk editor or data-recovery software.
The built-in cipher utility can be used to wipe data from the deallocated space, making it (almost) impossible to recover. Almost - because there are specialised forensic techniques that can potentially recover data even after it has been overwritten several times.

In order to clean the C: drive, first quit all programs. Then run this command against any directory on the target (C:) drive - it doesn't matter which one you choose. Note that it may take a significant amount of time to wipe large disks.

cipher /w:c:\test

Microsoft uses a multipass approach when overwriting data:

Microsoft’s cipher.exe, writes a pass of zeros, a pass of FFs, and a pass of random data, in compliance with DoD standard 5220.22-M. (US DoD, 1995)

I will stop here. Please let me know if you find posts like this one useful and/or informative. As usual, leave your feedback, comments, command examples etc at the bottom.




Wednesday, February 8, 2017

The best new space exploration movies

Introduction

The reason I decided to write this blog post is that I watched the movie "Hidden Figures". I was so impressed that I decided to spread the word. For those who know me it won't come as a surprise that I love movies about space. The university I graduated from prepared specialists for the space industry, and I have carried this passion for space exploration throughout my whole life. Combine this with an applied maths degree and you can see why "Hidden Figures" resonated so well. Trajectory calculations, computer programs, solving tricky mathematical problems - this was exactly what we were being prepared for. But that's not all. Space exploration is one of the pinnacles of our achievements as humanity.

International Space Station
Image source: http://spaceflight.nasa.gov/gallery/images/shuttle/sts-132/html/s132e012208.html

The International Space Station is the most complex machine ever built. Such achievements cannot be made in a vacuum (no pun intended). It is like a pyramid: in order to be able to do X, you need to have all the necessary capabilities at level (X-1). And for complex areas like space exploration those pyramids are huge. They contain multiple capability layers from numerous industries that, when combined, result in the fascinating and amazing world of space exploration. Countless institutes and universities worked on various projects contributing to this goal. Thousands of people have been involved directly or indirectly. Thanks to these often unknown heroes we've been able to conquer space, put people on the Moon and send probes to visit all the planets in our Solar system. "Hidden Figures" tells the amazing true story of the crucial contribution of three American women (who worked at NASA's Langley Research Center) to the success of the early space program.

Young people who haven't yet figured out what they are going to do in life should watch this movie - especially girls. STEM is not just for boys. I would like to see more women choosing IT, space technology or any other technical discipline as their career.

Inspired by this movie, I decided to compile a list of new space movies. I have to keep it quite broad - some of them are about space travel, outer space or space adventures; others explore the psychological aspects of travelling in space or aliens visiting Earth. Perhaps you will find something on this list that piques your interest. So let's call this list The Best Space Movies 2016-2017.

The best space movies


The book - currently #7(!) on Amazon


and the movie
"Hidden Figures"
PG - 2016 - Drama film/Comedy-drama - 2h 7m

IMDb: 7.9/10   Rotten Tomatoes: 94%

Let me start by saying: if you have a teenage daughter, take her to the cinema to watch this movie! It is such an empowering story, and these three ladies are great role models for the younger generation. I even think that this movie or the book should be added to the school curriculum. Watching it may spark an interest in STEM subjects.

The three "hidden figures" - three American women - worked with the first computers in the 1960s. Their work (calculating trajectories) was extremely important for NASA's early spaceflight.

"Arrival"
PG - 2016 - Mystery/Science fiction film - 1h 58m

IMDb: 8.2/10   Rotten Tomatoes: 94%

I felt that this movie was like Inception - so deep that you need to watch it several times to unpack the multiple intertwined layers and fully understand what the authors wanted to say.

I want to believe that we are not alone in the universe. But what will first contact look like? Will we be able to understand each other? Will we fail and resort to military force, or will we let science build a bridge between the two worlds?
"The Space Between Us"
PG-13 - 2017 - Fantasy/Science fiction film - 2h 1m

IMDb: 6.1/10   Rotten Tomatoes: 18%

An 18% rating from Rotten Tomatoes or 86% from Google users - which would you trust more? I think the answer depends on the audience. This is a romantic story combined with science fiction. The film won't win any Oscars; teenagers will like it, while a more mature audience would probably enjoy other movies from this list more. Setting the cheesy stuff aside, I like the idea of the first child born on another planet (Mars). The boy grows up able to interact with only a handful of people, knowing that there is a parent planet full of them. And inevitably there are interesting challenges when the teenage boy finally gets to Earth.
"Passengers"
PG-13 - 2016 - Fantasy/Science fiction film - 1h 56m

IMDb: 7.1/10   Rotten Tomatoes: 31%

I am puzzled by the low-ish 31% rating from Rotten Tomatoes. Their audience score of 67%, combined with the 7.1 rating on IMDb, is I think a better reflection of this movie's quality. Personally, I liked it. I love interstellar, space exploration movies. The day will come when our children launch spaceships to cross the vast emptiness and reach other stars. It was interesting to see some of the engineering aspects of space flight: space debris/meteoroid defences, a self-healing spaceship, medical pods... The only reasonable way to spend years in space is hibernation. But what happens if you wake up way too early? Alone...
"Rogue One: A Star Wars Story"
2016 - Science fiction film/Action - 2h 13m

IMDb: 8.1/10   Rotten Tomatoes: 85%

Judging by the high ratings, most people liked it. I must admit I am not a big Star Wars fan. The first three films were great back in the day, but since then Star Wars has become a demonstration of what can be achieved with the latest advancements in computer graphics. Rogue One is no different in my view. It's a prequel to the original three movies and it explains certain parts of the overall story. It's fascinating to see how the writers use their imagination and knowledge of the Star Wars universe to connect the new characters and story lines with the existing ones from the previous movies. It's eye candy for sure - amazing graphics. If you are a Star Wars fan, you will certainly want it in your collection. If you are like me, then watching it once is enough.


There are also two other movies that haven't been released yet which I would like to share with you. I hope both of them will be exciting to watch.


Courtesy IMDB

"Life" - will be released on the 24th of March 2017
2017 - Fantasy/Science fiction film


First evidence of extraterrestrial life on Mars... Research like this is happening right now in the real world. Remember the "follow the water" mantra? Mars orbiters have spotted traces of methane gas. Methane is unstable - it cannot persist for long. It needs to be replenished, but where does it come from? Methane can have both geological and biological origins.

Courtesy NASA https://www.nasa.gov/jpl/msl/pia19088/

With each generation, Mars rovers carry more and more instruments, including ones for wet chemistry experiments. They are looking for organic matter, complex molecules and perhaps even simple microbial life.

I am excited about a movie that explores one of the most important space exploration questions of the 21st century.


Courtesy IMDB

"Valerian and the City of a Thousand Planets" - will be released on the 21st of July 2017
2017 - Science fiction film/Action - 2h 9m


The 28th century... A time-travelling agent... A galactic empire... Based on a French comic series, it's got all the right ingredients to be a great movie. Let's wait and see!


Did you like this list? Have I missed any of the recently released or upcoming movies? Please leave your comments below. Happy watching ;)

Thursday, February 2, 2017

Useful Windows Command Line Tricks



Image courtesy https://blogs.msdn.microsoft.com/commandline/
I needed to run a couple of non-trivial commands from the command line recently, and it brought back memories of the times when I had more opportunities to be hands-on. Like any industry, IT has its own tricks of the trade. As we got accustomed to GUIs (more on the Windows side than *nix), the art of using the command line became less relevant and less utilised. Having said that, if you manage larger environments then automation is a must, and executing scripts and various commands at scale becomes a necessary skill. In this blog post I decided to share a few interesting and useful commands that I've learnt over the years managing Windows environments. The Internet is full of "Top X cool commands every administrator should know" articles containing fairly basic recommendations. There is no need to repeat those. I would like to share a few less trivial commands that might make it easier for you to perform certain tasks.

Display Wireless network password in clear text


netsh wlan show profile name=MyWiFiNetwork key=clear


The key=clear parameter gives us the ability to extract the WiFi password for any WiFi network (profile) stored on your computer.
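
If you don't remember the exact profile name (MyWiFiNetwork above is just a placeholder), you can list all profiles stored on the computer first:

netsh wlan show profiles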

Extract a list of Domain Admin users in the organisation


net group "Domain Admins" /Domain


By default the "Authenticated Users" group has Read access - any authenticated user in the organisation can execute this command to identify which users belong to which group. In the example above I used the Domain Admins group. This type of information is useful for attackers - it identifies the "juicy targets": which users to go after (phishing, brute forcing etc) to gain domain admin privileges.

Get a list of all users in the domain


net user /Domain


This gives you a long list of all user accounts in the domain. Again, it might be useful for attackers - it gives them another piece of the puzzle.

Get computer's IP address


ipconfig|find "IPv4"


This can be done in multiple different ways. Here I wanted to demonstrate the "piping" trick, where a vertical pipe character is used to combine two commands: the standard output of the first command is "piped into" the second one. In our case the "ipconfig" command displays a lot of information, but we use the "find" command to show only the lines containing "IPv4" - this gives us the IPv4 address of the computer.

In addition we can use another trick and push this information straight into the clipboard by piping the output of the "find" command into "clip"

ipconfig|find "IPv4"|clip



Display useful Wireless Network Connection information (WLAN)


netsh wlan show interfaces



netsh is a VERY powerful and useful command. Here we are using it to display information about all existing wireless network interfaces on our computer. This information is very handy when troubleshooting various network related issues.

We can also extract information about the wired interfaces - just replace "wlan" with "lan" in the command: netsh lan show interfaces


Display WiFi SSID


netsh wlan show interfaces|findstr "[^B]SSID"


It's great when commands like the one above dump a lot of useful information. But sometimes you just need this one piece - especially if you are running a batch file and want to identify a specific value. The previous example shows lots of different things including the SSID (wireless network name). If we just need to extract the SSID we can pipe the output into the "findstr" command. I decided to use "findstr" instead of a simpler "find" because it supports regular expressions. The first command displays both SSID and BSSID and I wanted to remove BSSID from the final result.

Get a MAC address

The netsh command that we used above to show interfaces' info can also be used to get the MAC addresses for each interface (disguised as a "Physical Address" in the output). But there is also a simpler command to do this:

getmac


It will display MAC addresses of all network interfaces that are present in the system.

Display system information

The "systeminfo" command outputs tons of useful operating system configuration details. Run it without any parameters first to see the variety of data it can provide. Sometimes it is useful to store all of that information in a file (e.g. to be imported into a centralised repository later on). For that purpose I would recommend changing the output to CSV format, which makes the import much easier:

systeminfo /FO CSV > c:\temp\sysinfo.csv


Using environment variables

Environment variables have been around since the MS DOS days. Just run the SET command to display them all in the console window. Each environment variable can be referenced by its name surrounded by the percent symbols.

See how each variable can be referenced in any other command:

echo %OS%
echo %PROCESSOR_ARCHITECTURE%



My only advice is to use environment variables wherever you can instead of hard-coding values in your scripts.
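
For example, instead of hard-coding a user-specific path such as C:\Users\John\Desktop in a script, you can reference the profile directory via the variable (the file name and path below are purely illustrative):

copy report.txt "%USERPROFILE%\Desktop"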

Energy report (Officially: Power Efficiency Diagnostics Report)


powercfg energy -output c:\temp\energy-report.html


This is probably one of the lesser-known commands. If you have never seen a report produced by it, give it a go and see what kind of information it provides. It is incredibly useful for troubleshooting power, sleep and hibernation related issues.




On-Screen Keyboard


osk


As simple as that. It will bring up a virtual keyboard on the screen - just in case you want to type with your mouse ;)


Bring up a User Accounts dialog


control userpasswords2


The new user accounts dialog window looks too fancy and less convenient to me (btw, you can access it via "control userpasswords"). But if you prefer the old style dialog then it's still there. You can bring it up by running "control userpasswords2" - even on Windows 10.

User, Group and Privileges Information


whoami /all


Without the "/all" switch whoami just returns the name of the currently logged-in user. With the "/all" switch you can see a lot more useful information, including all the groups this account is a member of (with their SIDs) and all the privileges assigned to this account (things like SeIncreaseQuotaPrivilege, SeSystemtimePrivilege etc).




WMI

Now let's explore the power of WMI. WMI is an incredibly powerful way of interrogating various system parameters. I want to share a few useful examples with you just to demonstrate what's possible. We will use the wmic utility that comes as standard with every version of Windows released after Windows XP.

Get motherboard manufacturer 

We will extract this information from the win32_baseboard WMI class. To make it more interesting I will add a few additional command line techniques that you might find useful:

for /f "tokens=9 delims= " %F in ('wmic baseboard^|more +1') do @echo %~F




Here we are extracting the 9th token (tokens in our case are space separated), which happens to be the motherboard manufacturer. Note: If a value contains spaces then they are treated as separate tokens by this method.

Using the FOR command to split a string into tokens is a generic way of handling strings from the command line.

I also wanted to demo the "more +n" trick: "more +1" means "skip the first line". The output consists of two rows - the table header and the row containing the actual values - so we use it to skip the header line.

There is a more elegant way to extract values in wmic. And I will demonstrate it in the next example.

Get physical memory size


wmic computersystem get TotalPhysicalMemory | more +1




This gives us the total physical memory installed in the system, in bytes (16GB in the example above).

We can also get the maximum memory capacity supported by the machine (note that MaxCapacity is reported in kilobytes)


wmic memphysical get MaxCapacity | more +1



We see that this machine supports up to roughly 32GB of RAM - double the 16GB currently installed.

Get a list of all applications that run automatically when a user logs into the system


wmic startup


Get version of the Adobe Acrobat Reader installed on your computer


wmic product get name,version | find "Adobe Acrobat Reader"








I hope you found a couple of useful commands here. What are your favourite commands? Please share them in the comments section below.

Keywords: windows command line, command line tricks, useful commands, wmic, devops, sysadmin, systems engineering, microsoft windows

Wednesday, January 25, 2017

Tesla and the future of the autonomous cars

I have been following companies like Tesla and SpaceX for quite some time. Elon Musk has an amazing ability (or sense) for identifying which established, mature industries are ready for a shake-up. It is fascinating to watch how companies like Tesla, with no prior experience, come in and disrupt their corresponding industries. Being "out-innovated" should be one of the biggest fears, and yet many large organisations (in both the car manufacturing and space technology industries) didn't have that sense of urgency. They were too slow. This could be due to various reasons - an organisational culture that doesn't support innovation, the cosy feeling that you are safe because there are no new players to change the status quo, or the sense that the entry barrier is too high.

What makes Elon Musk's approach unique is, I think, partially his software industry background. The "release early, release often" mantra, so common in the modern software development world, allows his teams to iterate at a higher pace than other players, receive valuable feedback and close the feedback loop by releasing newer versions faster than their competitors. It might be OK to release new products/models once a year in classic auto manufacturing (with major overhauls every ~4 years), but in the software development world that would be a crime. And we can certainly see the software industry influence in Tesla's approach. The pace of innovation at Tesla is just crazy. They "will never stop innovating".

Earlier this month Elon Musk announced that the new revision of Autopilot for HW2 would be rolled out to the first 1,000 Tesla cars. All other Teslas that support this update were promised to receive it too, but running in shadow mode. This means it won't actively control the car; instead it will collect statistics in the background (what it would have done if it were enabled).



As you can see from the second tweet, the plan was to enable this mode for all cars by the end of the week. With just a minor delay, we received this update:



So after just a couple of weeks of testing, all Tesla cars with the HW2 package started receiving the update. By running the system in shadow mode on the first test group of cars, Tesla engineers were certainly able to collect valuable real-life data that helped them calibrate the sensors. But as you can see from the last tweet, some cars won't be able to complete the camera calibration process and will have to visit a service station for an "adjustment of camera pitch angle". In parallel (according to Musk), engineers are working on a software solution to this issue. Preparing a software patch to mask or compensate for a hardware problem is a very common approach in the IT industry - another indicator of what is in Tesla's DNA.

And now this tweet:

As more data arrives, Tesla can keep tweaking and adjusting the capabilities of the new system. This is a spiral. Releasing updates in relatively short two-to-six-week iterations is very similar to the agile approach that has become common practice in IT. It delivers increased value to the customers (end users) sooner.

Late last year Elon Musk announced an ambitious goal for 2017: a Tesla should be able to drive fully autonomously (no touch) from LA to New York, drop the driver (passenger? occupant?) off at Times Square and then drive away to find a parking spot by itself. We live in the future!

As a side note - sometimes it feels like Elon makes a public announcement first and then forces his employees to stick to these commitments. It's a nice way to maintain the high pace.

Another important piece of news this week came from the National Highway Traffic Safety Administration (NHTSA).

In May last year a 2015 Tesla Model S operating in Autopilot mode collided with a tractor-trailer in Florida. In June 2016 NHTSA opened investigation PE 16-007 to "examine the design and performance of any automated driving systems in use at the time of the crash." They have now completed the investigation and released their report.


 The result of the investigation?

NHTSA’s examination did not identify any defects in the design or performance of the AEB or Autopilot systems of the subject vehicles nor any incidents in which the systems did not perform as designed.
This is great news for Tesla. But that is not all. The report also highlighted the safety benefits of Tesla's Autosteer feature. Automatic Emergency Braking (AEB) technologies include the following: Forward Collision Warning (FCW), Dynamic Brake Support (DBS), and Crash Imminent Braking (CIB).

In 2016 20 car manufacturers (99% of the US new car market) made a voluntary commitment to make AEB "standard on virtually all light-duty cars and trucks with a gross vehicle weight of 8,500 lbs. or less no later than September 1, 2022"

IIHS research shows that AEB systems meeting the commitment would reduce rear-end crashes by 40 percent. IIHS estimates that by 2025 – the earliest NHTSA believes it could realistically implement a regulatory requirement for AEB – the commitment will prevent 28,000 crashes and 12,000 injuries.
12 thousand injuries (!!!), some of which would inevitably have been fatal. Imagine how many people will make it safely back home to their loved ones thanks to this amazing technology. A 40% reduction in rear-end collisions alone is a pretty big deal!

A typical human driver's reaction time ranges from 0.7 to 3 seconds, with 1.5-2.3 seconds being the average. 0.7 seconds is an eternity in the world of computers. Remember when automatic gearboxes were first introduced? They were slow and clunky. Purists advocated for manual gearboxes - better control of the car, faster launch times (quarter mile) etc. Technology kept evolving, and then one day models equipped with automatic gearboxes started posting faster times. We just have to accept the fact that computers are faster. The same is happening with Autopilot technologies. Even 0.7 seconds is plenty of time for a computer system.
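
To put these numbers in perspective: at 100 km/h a car covers roughly 28 metres every second, so it travels about 19 metres during a 0.7-second reaction and more than 60 metres during a 2.3-second one - before braking has even begun.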

Image courtesy of https://www.technologyreview.com/s/534981/car-to-car-communication/


Now combine this with car-to-car communication and the capability gap becomes even wider. Cars will be able to update each other and provide situational awareness. Say a car two vehicles ahead of us in the same lane sees an obstacle and initiates emergency braking. A human driver wouldn't even be aware of the emerging situation yet. Compare this with an autonomous car that immediately receives this information and responds by priming the brakes, reducing engine power and preparing to slow down and stop (while actively measuring the distance to the car directly in front).

These recent successes and overall progress in the world of self driving cars got me pondering - how will these technologies change the world around us in the near future?

Faster reaction time means that autonomous cars can travel at faster speeds and potentially closer to each other (to increase the flow rates). This will most likely result in dedicated lanes for autonomous cars only (human drivers will not be allowed to drive there). These lanes might also be equipped with additional features to aid navigation and safety.

Since autonomous cars will be safer, the insurance premiums for human drivers will go up. Do you want to enjoy driving? You will need to pay more for this "luxury". Imagine telling your grandkids stories about how you used to drive cars yourself - manually - and how those cars burned toxic, flammable liquids in their engines.

Companies like Uber will introduce autonomous taxis. Unlike humans, who need to rest (or work in multiple shifts), these new taxis will be able to work 24/7. Eliminating the need to pay a driver's salary will drive the cost of the service down compared to standard taxis.

The idea of car ownership will change too. On a typical business day I need a car for one hour in the morning to get me to work and one hour in the evening to bring me back home. It might be tempting to become a passenger in an autonomous car and enjoy reading or browsing the Internet while it takes you to and from work.

After dropping you off at work:
If it is a taxi, it will drive off to serve the next customer.
If it is your own car, it will either automatically find parking (and wait until you need it again in the evening) or join the fleet of taxis for a few hours to earn some money for you while you work.

Buying a new car will be fun! You will go to your favourite car sales web site, search for a model you like, complete the transaction... and the newly purchased car will automatically arrive at your doorstep. "Beep, beep, I am here!"

Private car sales might change too. It might be possible to complete the transaction online (most likely via some sort of escrow service for added safety), with no need to find time to meet face to face for an inspection. The car will drive itself to the mechanic of your choice, you will receive an inspection report online, and after the transaction is complete the car will drive to your home address by itself.

Parking will look different. It will most likely go underground to save space (no need for huge surface parking lots anymore). It can become denser too - there will be no need to open doors, and autonomous cars can be guided precisely enough to park centimetres from each other (while eliminating door chips and scratches).

Cars will automatically visit service stations for scheduled service or necessary repairs. They will also be able to go to the car wash and return home.

Crowd-sourced navigation services like Waze work well. But the true potential of this approach will be unlocked when connected cars begin talking to each other and to some sort of central traffic planning and navigation centre. By knowing current locations, planned destinations, and current road and weather conditions, it will be possible to select the best routes and optimise overall city traffic flows.

Navigation planning and car-to-car communications will allow special services (police, ambulance, firefighters) to command all self-driving cars to slow down and move aside to free up a lane. It will also be possible to isolate a section of the road or an intersection for roadworks so that autonomous cars will find alternative routes.

We will see more and more electric cars on the roads. And electric cars are silent. Car enthusiasts (rev-heads) love engine sounds (which might soon become extinct). That meaty, low-pitched sound of a V8 engine... Sorry, turbo fours and sixes - your sound just can't match it. Remember how popular mobile phone ringtones were? I predict that in the near future electric cars will come "equipped" with a few standard engine sounds (mainly for the enjoyment of the driver), and there will be an option to buy and download additional sound packs containing any imaginable engine sound for a small fee.

As a security professional I can also see the new risks. For example, what if someone simulates an emergency braking signal? It could just be a harmless prank - watching all the autonomous cars come to a screeching halt. Or it could have more nefarious motives: the bad guys might trick a car into stopping or slowing down at a given location to make it easier to attack. Similar attacks have been demonstrated recently, where a low tyre pressure signal was faked, leading the driver to pull over and stop to check the tyres.

Potentially, someone could try to gain an unfair advantage by transmitting false data - e.g. to free up a lane or get priority when crossing an intersection.

But all these issues aside, I am excited about the future. And thanks to the grand vision of people like Elon Musk this future is becoming a new reality.

I would like to hear your feedback. What are your thoughts? Do you have a self-driving car already, or are you planning to buy one soon? How do you think the world will change? Please leave your answers below.