Wednesday, November 21, 2018

L1 Terminal Fault Vulnerability (L1TF) aka Foreshadow

L1TF Introduction

L1 Terminal Fault (L1TF) is a side-channel vulnerability in Intel CPUs. It is another speculative execution vulnerability, similar to the others identified and disclosed in recent months (remember Spectre?). The vulnerability was later dubbed Foreshadow.

Modern CPUs have up to 3 levels of cache, with L1 being the smallest and fastest. Each CPU core has its own L1 cache, while L3 is a larger cache shared by all cores (which leads to other classes of issues and vulnerabilities).

The key reason why such vulnerabilities exist is speculative (or out-of-order) code execution. The coffee shop at my work has this vulnerability. When a barista sees me in the morning, she starts making a small latte BEFORE I even have an opportunity to place an order and pay. They know me so well that they PREDICT I will order a small latte. They are usually right, which helps with the overall speed of delivery and makes me a happy customer. If one day I decide to "troll" them and order something else, they will have to discard the small latte and prepare the new order from scratch.

In the L1TF case, the issue is caused by "over-optimization" in the CPU's internal logic: virtual address translation happens in parallel with access to the L1 cache. I highlighted "in parallel" because while one part of the CPU is still retrieving the bits that indicate the present/not-present status of a particular Page Table Entry (PTE), another part "hopes for the best" and assumes the data we are trying to read from that page is already in the cache. There are 2 possible outcomes. In one case, the page is actually present in memory and the L1 cache contains the correct value. This helps overall performance, since speculative execution has already used the value and moved execution forward. In the other case, the needed page is not in memory (e.g. it has been swapped out to disk), and a "terminal fault" condition arises (hence the name of this class of vulnerabilities). But by the time the terminal fault/page-not-present condition is raised, the speculative path has already accessed and used data values from that memory page. As a result, an attacker can read data from physical addresses if a "not present" page table entry can be created for the addresses the attacker is interested in, and if those addresses are present in the L1 cache.
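To make the ordering problem concrete, here is a toy Python model. This is nothing like a real exploit (which requires native code and cache-timing measurements); it only illustrates that the speculative path consumes L1 data before the present-bit check resolves. All names here are illustrative.

```python
# Toy model of the L1TF race (illustrative only, not a real exploit).

l1_cache = {0x1000: 0x42}                  # physical address -> secret byte sitting in L1
pte = {"present": False, "phys": 0x1000}   # page table entry marked "not present"
probe_footprint = set()                    # models attacker-observable cache side effects

def speculative_load(entry):
    # Speculative path: optimistically forward the value from L1...
    secret = l1_cache.get(entry["phys"])
    if secret is not None:
        probe_footprint.add(secret)        # dependent access leaves a cache footprint
    # ...while the present-bit check completes later and raises the fault.
    if not entry["present"]:
        raise MemoryError("terminal fault: page not present")
    return secret

try:
    speculative_load(pte)
except MemoryError:
    pass                                   # the fault is architecturally visible...

print(0x42 in probe_footprint)             # ...but the secret already influenced cache state: True
```

In real hardware the "footprint" is the cache state of an attacker-controlled probe array, recovered afterwards by timing memory accesses.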

By exploiting this type of vulnerability, an attacker can extract various secrets stored in memory - passwords, crypto keys etc. - i.e. read privileged data across trust boundaries.

Here is the original Intel article that describes the L1TF vulnerability:

The L1TF vulnerability affects several generations of Intel CPUs and has 3 CVEs associated with it, the first carrying a high risk score of 7.3:

CVE-2018-3615 - L1 Terminal Fault: SGX
CVE-2018-3620 - L1 Terminal Fault: OS/SMM
CVE-2018-3646 - L1 Terminal Fault: VMM

Vulnerabilities of this kind affect many different companies, so it becomes hard to keep track of the various advisories these companies issue.

In this article I decided to collate knowledge base articles and remediation steps published by various affected vendors - all on one page.

Here is a nice "Spectre Meltdown checker" shell script that can check the status of the various vulnerabilities in this family and of their mitigations:

3rd party advisories and mitigation guidance

Cloud hosting providers

Microsoft Azure
Google Cloud GCP
Digital Ocean

OEM, hardware vendors, software companies

Microsoft Server and Hyper-V
Considerations for OpenStack
VMware
VMware performance impact

Now that we've covered the L1TF vulnerabilities, I have to mention that security research doesn't stop there. Just recently there were 7 (!) new Spectre and Meltdown-like variants published in this article. So there are new waves of patches coming our way ;)

Tuesday, October 9, 2018

My first months in America - Part 3 - automotive edition

I slowed down my blogging activity recently but a few of my ex-colleagues and friends in Melbourne asked me to continue writing about my US experience. So here we go - here is a 3rd blog post about some interesting things I came across here in the US. This is going to be an automotive edition.

I've learnt that if you want to buy a car, there are companies that will bring the car to you. From what I've seen, this is more common in the luxury/more expensive segment. A potential buyer can simply express interest, agree on a place and time to meet, and someone will bring the car there for a test drive. That's pretty cool, especially for busy people who don't have much time to drive around dealerships all over the Bay Area.

Continuing on the cars topic - there are 3 passenger car models that are very popular here. In addition to the Toyota Camry and Corolla, I was surprised to see A LOT more Honda Civics compared to Australia. And while the Toyota Camry was the best selling car in America in 2016 and 2017, Civic numbers look strong too. In fact, all 3 of the most popular models posted 30K+ sales figures in March 2018, with the Honda Accord not far behind in 5th place with 24K cars sold. This is the passenger car segment (measured in units - I have a surprise later in this article regarding another metric, "by revenue"). But what I've discovered is that Americans love their trucks. A (pickup) truck here is not a "large, heavy motor vehicle used to transport goods". Here it is more akin to a ute, but bigger and more brutal/manly. Ford with their F-Series is a clear winner, with 73K+ sold in April.

Ford F-150, 2015 model year

There are also a lot of hybrid and fully electric cars (at least here in California). In fact, I have come to the conclusion that the Bay Area lives in the future - about 5 years ahead of the rest of the world. Every morning I drive to work past Fremont (where Teslas are made) and I see trucks loaded with Tesla Model 3s, taking them all over the country. There were already many Tesla Model S cars on the road 2 years ago. But since Elon fixed the production issues for the Model 3, it's just incredible how many of those "baby Teslas" I see now. Just recently I was chatting with a colleague who came from LA, and he said they didn't have that many Teslas over there. Another fully electric car that is getting very popular is the Chevy Bolt EV (not the Volt, which is a hybrid - the naming is quite confusing).

And speaking of the Model 3 - new numbers released for August 2018 indicate that Tesla has become the best selling passenger car in the US (by revenue).

Top US selling cars by revenue.

The car charging infrastructure is very well developed too. We have several spots in our parking lots equipped with charging stations. The owners of the electric cars have an internal Slack channel, where they maintain a queue to make sure everyone gets a chance to charge their cars. Usually they agree on 2-hour time blocks.

Hybrid cars are very popular here. There are 2 types - one that charges the battery while coasting or braking, and another (called a plug-in hybrid) that can also charge its battery via a charging station.

The popularity of hybrids is explained both by California being super "green" (ecology is a huge topic here) and by hybrids simply being cheaper to run (achieving fuel consumption of 36 MPG and better). Here I need to explain what MPG is: car fuel consumption is measured in miles per gallon (MPG) - a unit not only meaningless to the rest of the world but also an inverse measure (the larger the value, the better)! While litres per 100 km was very natural and easy for me to understand, the number of miles I can drive on a gallon of petrol (it's called gas here) is harder to "feel".
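To get a "feel" for the inverse relationship, here is a quick conversion sketch (assuming US gallons; the helper function is my own, purely for illustration):

```python
# Convert US miles-per-gallon to the metric litres-per-100km measure.
LITRES_PER_US_GALLON = 3.785411784
KM_PER_MILE = 1.609344

def mpg_to_l_per_100km(mpg: float) -> float:
    """Convert US miles-per-gallon to litres per 100 km."""
    return 100 * LITRES_PER_US_GALLON / (mpg * KM_PER_MILE)

# The 36 MPG hybrid figure mentioned above:
print(round(mpg_to_l_per_100km(36), 1))   # 6.5 (litres per 100 km)
```

Note that doubling MPG halves litres per 100 km - hence "inverse".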

I haven't seen weekly petrol price fluctuations. But petrol prices in San Francisco can be a dollar higher (per gallon) than in San Jose. To give you an idea, at the time of writing, petrol prices in SF are around $4.

I have a soft spot for American muscle cars. They look great, they are powerful (especially in straight-line performance) and they are not that expensive. I see a lot of Mustangs (my favourite), Chargers etc. on the road, and it brings a smile to my face.

Another novelty was the "Spare the Air" campaign. Basically, it's an organisation that monitors air quality and issues Spare the Air alerts, encouraging motorists to leave their cars at home and use public transport or carpool (where each car has more than one occupant) instead.

I'd like to say a few words about the car search web sites. There are generic classifieds web sites (like Autotrader, CarMax, TrueCar, various dealers' sites etc). They do their job and are a great starting point when you begin exploring options. I see some innovation too. But having worked for Carsales in Australia (which obviously makes me biased), I have to say that Carsales' search is the best I've seen so far. The richness of its search interface is just amazing compared to some other sites I've used recently.

Craigslist is quite popular too - especially in the cheap used cars segment. But that interface... seriously?!

Thank you for reading! As always, please leave your questions and comments below.

Tuesday, May 23, 2017

Cloud computing, agile software development and DevOps are a holy trinity

Agile methodologies, DevOps and cloud technologies are sometimes referred to as the holy trinity of the modern software development world. One of the promises of the Agile approach was to make software release cycles shorter. Gone are the days when software firms could ship major releases every 2 years (and perhaps one or two service packs in between). Competitive pressure requires a much higher delivery frequency. Product owners have adopted hypothesis-driven approaches to quickly validate assumptions and close feedback loops.

To support fast-paced environments, organizations needed to reduce friction between different teams as much as possible. Around 2009, DevOps was born. It is a way of thinking - in a way, a philosophy. It brings different teams together into a cohesive unit focused on end-to-end delivery. By eliminating the human element and automating as much as possible, DevOps has shifted the focus from the traditional questions of stability and efficiency towards agility and unlocking teams' innovation potential.

To be successful in innovation, organizations have to rely on the creative thinking and engineering talent of their staff as key inputs. Humans are good at thinking, and engineers enjoy solving new problems. All mundane, repetitive tasks can and should be automated. These tasks are still a necessary part of the overall software development cycle, but it has become obvious that human talent is wasted performing them - ask QA engineers how they feel about manual regression tests. Automating repetitive tasks is one of the DevOps deliverables. This way organizations gain pace, reduce the risk of human error and allow their engineers to focus on the important things.

A/B testing, hypothesis validation, faster feedback loops… They all require quick provisioning and elasticity. In the traditional IT world, it could take weeks to provision a new server. Software delivery teams couldn’t and didn’t want to wait that long. And, what if we need this server only for a couple of weeks to quickly validate an idea?!

This is how cloud was born. Public cloud offerings are becoming mainstream. More and more companies find it tempting to rent computer resources (OPEX) instead of investing in their own hardware and data centers.

A public cloud model is perfect for a start-up. More mature businesses with an existing hardware footprint quite often choose the hybrid model in order to protect their existing investment. This model offers the best of both worlds: legacy applications continue running in the existing datacenters (private cloud), with the ability to overflow into the public cloud, whose elasticity can provide immediate access to the required resources where it makes sense.

There might be several reasons why a company would decide to go down this path. It could be a hard-to-maintain legacy application that would be nearly impossible to migrate. Or they might be unwilling to move sensitive data or mission-critical applications, as the risk of doing so might outweigh the perceived benefits of the public cloud. Private cloud or datacenter infrastructure can continue running static workloads (the ones that don't require the adaptive capabilities of the public cloud), provide additional security benefits, or meet regulatory requirements where sensitive data cannot be stored in certain geographies.

Modern software development practices push us to break legacy monoliths into independent microservices. This is where some of the newly written code can easily be moved to the cloud.

It is a mistake to think that DevOps is really just "IT for the Cloud". DevOps can really shine for companies considering the hybrid cloud approach.

What are the key DevOps considerations when running in a hybrid cloud environment?

Define your technology stack and stick to it

Avoid creating a "zoo of different technologies". I often use this term to describe a situation where different teams select similar-but-different technologies that provide roughly the same functionality (e.g. MSMQ, SQS and Redis as queuing mechanisms). Identify the key "building blocks" your application(s) consist of and socialize them with everyone in the team. For example, if the front-end team has decided to use Angular, then it should remain your weapon of choice for some time. New JavaScript frameworks get released every 6 months; it would be a mistake to chase the "latest and greatest" flavor of the day.

The "sticking to it" part doesn't mean stagnation. But every proposed change should be justified - what will the new technology give us compared to what we already have? If you decide to go ahead, then aim to replace the existing technology, not to run the new one in parallel. A short overlapping period is fine, but running 2 similar technologies for a long time is what creates a hard-to-support "zoo" and eventually results in increased technical debt.

Seek feature parity between the on-prem and cloud-based deployments

When running in hybrid mode, it might be tempting to choose cloud-native technologies in the cloud part of your infrastructure - for example, SQS for simple queuing. But in most cases that would be a mistake. You would break the parity principle, meaning you must now support two different technologies: the two parts won't look the same, you would have to deploy them differently, and testing would most likely differ too. Consequently, it is better to select technologies that can be deployed into both parts of the hybrid solution.

Abstract the underlying components and infrastructure from your application

If the temptation to use cloud-native technologies is still too high, then at least make an effort to architecturally abstract these pieces into a separate layer. Avoid making direct calls to these components from all around the application. Instead, implement a thin abstraction layer as a single entry point. Provide this functionality as an easy to consume and integrate library/module. By doing this you will make it easier to upgrade and replace the underlying technology later on. This will also reduce your reliance on a particular cloud provider (“vendor lock-in”) making your overall solution more portable.
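As a minimal sketch of this idea (the interface and class names are hypothetical, purely for illustration), the application depends on one thin interface; an SQS- or Redis-backed implementation would hide the vendor SDK behind the same single entry point:

```python
# Sketch of a thin queue abstraction layer (illustrative names).
from abc import ABC, abstractmethod
from collections import deque
from typing import Optional

class MessageQueue(ABC):
    """Single entry point for queuing, used by the whole application."""

    @abstractmethod
    def send(self, message: str) -> None: ...

    @abstractmethod
    def receive(self) -> Optional[str]: ...

class InMemoryQueue(MessageQueue):
    """On-prem/test backend; a cloud backend would implement the same interface."""

    def __init__(self) -> None:
        self._messages: deque = deque()

    def send(self, message: str) -> None:
        self._messages.append(message)

    def receive(self) -> Optional[str]:
        return self._messages.popleft() if self._messages else None

# Application code depends only on the abstraction, never on a vendor SDK:
q: MessageQueue = InMemoryQueue()
q.send("order-created")
print(q.receive())   # order-created
```

Swapping the backend then becomes a one-line change at the composition root rather than a change scattered across the codebase.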

Avoid manual changes at all costs

Consistency is key to operating large, complex, fast-paced environments at scale. You can't afford to have unique snowflakes in your environment. The "cattle, not pets" mantra means all of our individual assets should look the same. Any changes to the setup or configuration should go via the orchestration layer. When troubleshooting production issues, it might sometimes be tempting to log in to a server and quickly make changes manually. This is driven by a false sense of convenience. If making changes through automation sounds slow or inconvenient, it's a sign that your automation/orchestration toolset needs to be improved. This leads us to the last point.

Select the right automation platform

It is important to select the right tools to automate and manage your infrastructure. These tools should be able to control both the on-prem and the cloud parts providing a seamless, "single pane of glass" view of your infrastructure. They should be able to speak the "native language" of the cloud platform and provide the necessary level of convenience for the DevOps team to use them as the primary (and only!) way of maintaining the environment.

Keywords: Agile, DevOps, Cloud, "hybrid cloud", "public cloud"

Sunday, April 9, 2017

The mysterious "J" instead of a smiley face

I am sure many of you have received emails with a "mysterious" capital "J" in sentences like "Thank you J". And I am sure many of you know the answer. But just in case you are still wondering what it means - here is the answer.

You may have figured out from the context that "J" stands for a smiley face. And you would be right. But why "J"? After all, it doesn't really resemble a smiley face.

What does "J" mean in email messages?

I will use a short email that I received earlier this week as an example:

It looks absolutely fine in my Outlook 2016. But in a different mail client, the smiley face is replaced with a capital "J". Let's look at the actual HTML code of this email:

<body bgcolor=white lang=EN-US link="#0563C1" vlink="#954F72"><div class=WordSection1><p class=MsoNormal>Team effort! Strength in numbers <span style='font-family:Wingdings'>J</span> <o:p></o:p></p><p class=MsoNormal><o:p>&nbsp;</o:p></p>

Interesting! Where Outlook renders a smiley face, we actually have a span tag that switches to the Wingdings font. And that span contains a single capital "J" character.

<span style='font-family:Wingdings'>J</span>

To understand what's actually happening here, we need to launch the Character Map application built into every version of Windows.

Let's select the Wingdings font there and type a capital "J"

Wingdings font - capital "J"

See the smiley face there? Now it all makes sense. It's Outlook's way of embedding a smiley face into the email body. I've seen this issue for quite some time, so this decision was most likely made before the wider adoption of Unicode. Email clients that support both HTML and the Wingdings font will render the smiley face properly. But sometimes HTML tags get stripped out, the encoding changes from server to server, or Wingdings is not supported - in those cases the end result is a capital "J".

Note - this is not the same smiley face as the Unicode character U+263A (decimal 9786), which looks like this: ☺

Unicode character U+263A - smiley face
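The difference between the two characters is easy to verify, for example in Python:

```python
# The Wingdings trick stores a literal "J" (U+004A); only the font makes it
# look like a face. The Unicode smiley is a completely separate code point.
print(ord("J"))         # 74   (U+004A)
print(ord("\u263A"))    # 9786 (U+263A)
print("\u263A")         # the real smiley: ☺
```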

So, now you know what this mysterious "J" means and where it comes from. Hope you've enjoyed this quick investigation. 

Monday, February 20, 2017

Australian Banks Security (HTTP headers edition) - Feb 2017

Back in 2015 I wrote 2 blog posts examining the security posture of the major Australian banks. I focused on only two aspects - HTTP security headers (the presence or absence of particular headers) and the login forms (password lengths, autocomplete etc). On one hand, this is not in-depth research, and it is certainly not a vulnerability assessment of the kind I am sure all these banks regularly go through. On the other hand, it is a good indication of whether a bank's development and security teams follow modern security practices and put enough effort into their online security. It may serve as an indirect indication of the overall state of security in a given organisation.

I was curious to see if there were any changes (for better or for worse) during these last 2 years. HTTP security headers have really become mainstream and I expected the adoption rates to be higher.



Scott Helme has continued to evolve his great Security Headers web site, which I used during my previous analysis. Similar to the Qualys SSL Server Test tool, he has added an overall rating, which I will include as a new column. Another nice addition is a new check for the Referrer-Policy header. If you haven't done it yet, go to Scott's site and check the HTTP headers emitted by your own web site. Let me know if you need any help understanding or addressing any of the highlighted issues.

Let's see what Australian banks do with regard to HTTP security headers in February 2017:


Bank | Score | SecurityHeaders rating | HSTS | CSP | Public-Key-Pins | X-Frame-Options | X-XSS-Protection | X-Content-Type-Options | Server | X-Powered-By | X-AspNet-Version
IMB | | C | Yes | No | No | Yes, DENY | Yes | Yes | No | No | No
Bank West | 4 | C | Yes | Yes | No | Yes, SAMEORIGIN | Yes | No | No | Yes | No
Beyond | 3 | E | Yes | No | No | Yes, SAMEORIGIN | No | No | No | No | No
ING Direct | 3 | E | Yes | No | No | Yes, SAMEORIGIN | No | No | No | No | No
St George | 3 | E | Yes | No | No | Yes, SAMEORIGIN | No | No | No | No | No
Bendigo Bank | 2 | E | No | No | No | Yes, SAMEORIGIN | No | Yes | No | No | No
Teachers Mutual | 2 | E | Yes | No | No | No | No | No | No | No | No
CUA | 1.5 | E | Yes | No | No | No | No | No | Yes, CUA Server | No | No
Commonwealth Bank | 1 | E | Yes | No | No | Present, but incorrect ALLOW-FROM syntax | No | No | Yes, Apache/2.4.6 (Red Hat) OpenSSL 1.0.1e-fips | No | No
Newcastle Permanent | 1 | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
People's Choice Credit Union | 1 | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
P&N | 1 | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
Suncorp | 1 | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
Westpac | 1 | F | No | No | No | Yes, SAMEORIGIN | No | No | No | No | No
AMP | 0.5 | F | No | No | No | Yes, SAMEORIGIN | No | No | Yes, IBM_HTTP_Server | No | No
ANZ | 0.5 | F | No | No | No | Yes, SAMEORIGIN | No | No | Yes, Apache | No | No
Bankmecu -> BankAust | 0 | F | No | No | No | No | No | No | No | No | No
Greater | 0 | F | No | No | No | No | No | No | No | No | No
Heritage | 0 | F | No | No | No | No | No | No | No | No | No
Macquarie | 0 | F | No | No | No | No | No | No | No | No | No
Bank of Queensland | -2 | F | No | No | No | No | No | No | No | Yes, ASP.NET | Yes, 2.0.50727

Key findings

  • Significant improvements over the last 2 years
    • Only 1 bank is in the negative territory (previously 7)
    • 7 banks have a score of 2 or above (previously only 1)
  • Better adoption of security headers (group 1) by the banks.
    • X-Frame-Options is the most popular header: 13 out of 21 banks (62%) have adopted it (previously only 4). I guess more security professionals now recognise clickjacking as a real weakness.
    • Great to see 8 banks out of 21 (38%) using HSTS (previously only 2)
    • But not everyone who emits the HSTS header includes subdomains (includeSubDomains)
    • And even fewer banks use the "preload" directive (a required step for HSTS preloading) - only CBA
  • Content-Security-Policy is still not getting any traction. Only one bank - Bank West - has implemented CSP. CSP is a powerful defence-in-depth measure against cross-site scripting, clickjacking and some other types of attacks.
  • The situation with the group 2 headers is even better. Many banks that were in the second half of the table lifted their game and removed these unnecessary headers. Only 6 banks out of 21 still need to fix this issue (previously 12).
  • There is still a long way to go.
    • No one uses Public-Key-Pins
    • Only 2 banks serve the X-XSS-Protection header. This is the simplest, essentially zero-risk header to implement!
    • Only 2 banks use the X-Content-Type-Options header (previously none). This is another extremely simple header to implement.
    • Understandably, no one uses the Referrer-Policy header yet.
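The kind of check behind these findings boils down to comparing a site's response headers against two lists - recommended security headers versus version-leaking ones. Here is a simple sketch (my own helper, not the SecurityHeaders scoring algorithm):

```python
# Hypothetical helper that flags missing security headers and leaky ones.
RECOMMENDED = [
    "Strict-Transport-Security", "Content-Security-Policy",
    "X-Frame-Options", "X-XSS-Protection", "X-Content-Type-Options",
]
LEAKY = ["Server", "X-Powered-By", "X-AspNet-Version"]

def audit(headers):
    """Return (missing security headers, version-leaking headers present)."""
    present = {name.lower() for name in headers}     # header names are case-insensitive
    missing = [h for h in RECOMMENDED if h.lower() not in present]
    leaking = [h for h in LEAKY if h.lower() in present]
    return missing, leaking

# Example: a response that only sets X-Frame-Options and leaks Server:
missing, leaking = audit({"X-Frame-Options": "SAMEORIGIN", "Server": "Apache"})
print(leaking)   # ['Server']
```

You could feed it the headers from any HTTP client's response object; the scoring and weighting are where tools like SecurityHeaders add the real value.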

Additional comments

BankAust redirects from its home page to a non-secure page. Why? Please fix this.

CBA made a mistake in the X-Frame-Options ALLOW-FROM syntax. There is no need for an equals sign there - the correct form is "X-Frame-Options: ALLOW-FROM https://example.com/".

Previous winner Bank West was the only bank to receive a lower score. One point was deducted for the presence of the X-Powered-By header. It's an easy mistake to make - the header usually "returns" after a .NET patch installation.

We have a new leader. Congratulations to the IMB bank. They made a massive jump (+5.5 points) fixing all of the issues and introducing many of the recommended HTTP security headers. Well done!

Sunday, February 12, 2017

Obscure Windows commands and features

My previous blog post has become quite popular crossing the 10,000 views mark in just a few days. Given such interest I decided to share a few more useful commands as well as some obscure tricks that I came across over the years.

hh.exe - HTML help

hh.exe has been part of Windows for a very long time. What makes it fun though is that it supports external URLs, so we can make calls like this:

hh http:\\    (notice the backslashes)

This made me curious - what is the user agent string for this "browser"? Running it on a Windows 7 machine:

hh http:\\

Let's just collectively exhale "Wow... IE7" ;)

But what about Windows 10 I hear you ask. There it is:

Same thing - IE7. Of course, this is not the real IE7 - Trident/7.0 is a giveaway. This is IE11 running in IE7 compatibility mode. But I still find it funny.

mshta.exe exhibits the same behavior except for the old style navigation bar. But I find the retro "IE6 style buttons" look way more amusing.

I mentioned another effect of HH in one of my earlier blog posts - even if you disable Adobe Flash player in your browsers, it will still be there (as demonstrated by opening a web page in HH).

"God mode"

You can create a directory with a specific GUID extension to enable the so-called God Mode. No, you won't get the BFG 9000. Instead, Windows will populate this directory with a LOT of tools and various management options - all in one spot. There is nothing here that you haven't seen before - just an interesting way of presenting all of these tools in the same place.

To do this, just create a directory with this specific name (the "GodMode" part can be anything, but the GUID part is important):

md "GodMode.{ED7BA470-8E54-465E-825C-99712043E01C}"

Yet another Recycle Bin?

In case one is not enough ;)

md YetAnotherRecycleBin.{645FF040-5081-101B-9F08-00AA002F954E}

Wherever you create this directory, it will act as a Recycle Bin. And it is the existing Recycle Bin - i.e. if you already had a few deleted files, you will be able to see them in the "new" bin too.

View Reliability History

Buried deep inside, the Reliability Monitor gives you a quick look at what has happened to your system recently that may have had an impact on system reliability.

To find it - just click Start -> Run -> and type "reliability"
Or you can choose a more convoluted path: go to Control Panel -> System and Security -> Review your computer's status and resolve issues -> Maintenance

and then click on "View reliability history" (at the bottom of this picture)

This is probably something that you have never used before. And yet it certainly is a useful representation of what happened on this system recently.

Notepad Log

Did you know that Notepad can append the date/time every time you open a document? This feature has been around for a long time, but not many people know about it.

Just create a new text document and put ".LOG" (without the quotation marks) at the top.

Now every time you open this file, a new timestamp is automatically added by Notepad.

Stored Credentials (aka Windows Vault)

Run this command:

control keymgr.dll

You will be able to see all currently stored credentials (and perhaps delete the unused or sensitive ones)

You can use a command line equivalent:

cmdkey /list

When was the computer rebooted last time?

Run this command:

net statistics server 

and check the "Statistics since <date time>" line

Find which application will open files with a particular file extension

I know that usually people just search the registry to get this information. But there is a simpler way, using 2 commands. First, use the ASSOC command to find the current file association for a particular file extension. Then feed that association into another command, FTYPE, to display which application handles it. E.g. let's find out which application will open PDF files:

assoc | find "pdf"
ftype AcroExch.Document.2015

Find all CAs (certificate authorities) in your organisation (Active Directory)

certutil has rich functionality. Here I am going to show you how to find all certificate authorities in your organisation:

certutil -ADCA | find "dNSHostName"

Get a list of all domain controllers

nltest is another powerful utility with lots of useful options. E.g. this is how you can quickly find a list of all domain controllers:

nltest /DCLIST:YourOrgDomainName

How to wipe deleted data using the cipher utility

I will quote this Microsoft KB article:

When you delete files or folders, the data is not initially removed from the hard disk. Instead, the space on the disk that was occupied by the deleted data is "deallocated." After it is deallocated, the space is available for use when new data is written to the disk. Until the space is overwritten, you can recover the deleted data by using a low-level disk editor or data-recovery software.

The built-in cipher utility can be used to wipe data from the deallocated space, making it (almost) impossible to recover. I say "almost" because there are specialised forensic solutions that can potentially recover data even after it has been overwritten several times.

In order to clean the C: drive, first quit all programs. Then run this command against any directory on the target (C:) drive - it doesn't matter which directory you choose. Note that it may take a significant amount of time to wipe large disks.

cipher /w:c:\test

Microsoft uses a multipass approach when overwriting data:

Microsoft’s cipher.exe, writes a pass of zeros, a pass of FFs, and a pass of random data, in compliance with DoD standard 5220.22-M. (US DoD, 1995)

I will stop here. Please let me know if you find posts like this one useful and/or informative. As usual, leave your feedback, comments, command examples etc at the bottom.

Wednesday, February 8, 2017

The best new space exploration movies


I decided to write this blog post because I watched the movie "Hidden Figures". I was so impressed that I decided to spread the word. For those who know me, it won't come as a surprise that I love movies about space. The university I graduated from prepared specialists for the space industry, and I have carried this passion for space exploration throughout my whole life. Combine this with an applied maths degree and you can see why "Hidden Figures" resonated so well: trajectory calculations, computer programs, solving tricky mathematical problems - exactly what we were being prepared for. But that's not all. Space exploration is one of the pinnacles of our achievements as humanity.

International Space Station

The International Space Station is the most complex machine ever built. Such achievements cannot be made in a vacuum (no pun intended). It is like a pyramid: in order to be able to do X, you need all the necessary capabilities at level X-1. And for complex areas like space exploration, those pyramids are huge. They contain multiple capability layers from numerous industries which, when combined, result in the fascinating and amazing world of space exploration. Countless institutes and universities worked on projects contributing to this goal, and thousands of people have been involved directly or indirectly. Thanks to these often unknown heroes we've been able to conquer space, put people on the Moon and send probes to visit every planet in our Solar system. "Hidden Figures" tells an amazing true story about the crucial contribution of 3 American women (who worked at NASA's Langley Research Center) to the success of the early space program.

Young people who haven't yet figured out what they are going to do in life should watch this movie - especially girls. STEM is not just for boys. I would like to see more women choosing IT, space technology, or any other technical discipline as a career.

Inspired by this movie, I decided to compile a list of new space movies. I have to keep it quite broad - some of them are about space travel, outer space, or space adventures; others explore the psychological aspects of traveling in space or of aliens visiting Earth. Perhaps you will find something on this list that piques your interest. So let's call this list The Best Space Movies 2016-2017.

The best space movies

The book - currently #7(!) on Amazon

and the movie
"Hidden Figures"
PG - 2016 - Drama film/Comedy-drama - 2h 7m

IMDb: 7.9/10   Rotten tomatoes:94%

Let me start by saying: if you have a teenage daughter, take her to the cinema to watch this movie! This is such an empowering story, and these 3 ladies are great role models for the younger generation. I even think this movie or the book should be added to the school curriculum. Watching it may spark an interest in STEM subjects.

The three "hidden figures" - three American women - worked with the first computers in the 1960s. Their work (calculating trajectories) was extremely important for NASA's early spaceflight.

"Arrival"
PG-13 - 2016 - Mystery/Science fiction film - 1h 58m

IMDb: 8.2/10   Rotten tomatoes:94%

This movie felt like Inception to me - so deep that you need to watch it several times to unpack the multiple intertwined layers and fully understand what the authors wanted to say.

I want to believe that we are not alone in the universe. But what will first contact look like? Will we be able to understand each other? Will we fail and resort to military force, or will we let science build a bridge between the two worlds?
"The Space Between Us"
PG-13 - 2017 - Fantasy/Science fiction film - 2h 1m

IMDb: 6.1/10   Rotten tomatoes:18%

An 18% rating from Rotten Tomatoes or 86% from Google users - which would you trust more? I think the answer depends on the audience. This is a romantic story combined with science fiction. The film won't win any Oscars; teenagers will like it, while a more mature audience would probably enjoy other movies on this list more. Cheesy moments aside, I like the idea of the first child born on another planet (Mars). The boy grows up able to interact with only a very small number of people, knowing that there is a parent planet full of them. And inevitably there are interesting challenges when the teenage boy finally gets to Earth.
"Passengers"
PG-13 - 2016 - Fantasy/Science fiction film - 1h 56m

IMDb: 7.1/10   Rotten tomatoes:31%

I am puzzled by the low-ish 31% rating from Rotten Tomatoes. Their audience score of 67%, combined with the 7.1 rating from IMDb, is, I think, a better reflection of this movie's quality. Personally, I liked it. I love interstellar, space exploration movies. The day will come when our children launch spaceships to cross the vast emptiness and reach other stars. It was interesting to see some of the engineering aspects of spaceflight: space debris/meteoroid defense, a self-healing spaceship, medical pods... The only reasonable way to spend years in space is hibernation. But what happens if you wake up way too early? Alone...
"Rogue One: A Star Wars Story"
2016 - Science fiction film/Action - 2h 13m

IMDb: 8.1/10   Rotten tomatoes:85%

Judging by the high ratings, most people liked it. I must admit I am not a big Star Wars fan. The first 3 films were great back in the day, but since then Star Wars has largely become a demonstration of what can be achieved with the latest advances in computer graphics. Rogue One was no different in my view. It's a prequel to the original trilogy, and it explains certain parts of the overall story. It's fascinating to see how the scriptwriters use their imagination and knowledge of the Star Wars universe to connect the new characters and story lines with the existing ones from the previous movies. It's eye candy for sure - amazing graphics. If you are a Star Wars fan, you will certainly want it in your collection. If you are like me, then watching it once is enough.

There are also 2 other movies that haven't been released yet which I would like to share with you. I hope both of them will be exciting to watch.


"Life" - will be released on the 24th of March 2017
2017 - Fantasy/Science fiction film

IMDb:   Rotten tomatoes

First evidence of extraterrestrial life on Mars... Research like this is happening right now in the real world. Remember the "follow the water" mantra? Mars orbiters have spotted traces of methane gas. Methane is unstable - it cannot persist in the Martian atmosphere for long. It needs to be replenished, but where does it come from? Methane can have both geological and biological origins.


Each generation of Mars rovers carries more and more instruments, including ones for wet chemistry experiments. They are looking for organic matter, complex molecules, and perhaps even simple microbial life.

I am excited to see a movie explore one of the most important space exploration quests of the 21st century.


"Valerian and the City of a Thousand Planets" - will be released on the 21st of July 2017
2017 - Science fiction film/Action - 2h 9m

IMDb:   Rotten tomatoes

The 28th century... A time-traveling agent... A galactic empire... Based on the French comic series, it's got all the right ingredients to be a great movie. Let's wait and see!

Did you like this list? Have I missed any recently released or upcoming movies? Please leave your comments below. Happy watching ;)