Building Qt 5.15 on Windows with OpenSSL

I have written about the many problems of building Qt 5 with OpenSSL in the past. Several years later, it is time to upgrade to the latest Qt 5.15, which is presumably the last release in the Qt 5 series. This time I decided to drop Windows XP support since it is just too much work to keep working and the XP market share is much lower today than it was 5 years ago.

Since the Qt build documentation is still lacking, here is the latest account of the ordeal.

First things first: install Strawberry Perl and Python 2 (yeah.. really).

For speedier builds, also install jom. We will also need vcpkg to get the OpenSSL binaries. Needless to say, all these tools need to be in your system PATH.

Now get the code:

git clone git://code.qt.io/qt/qt5.git
cd qt5
git checkout v5.15
perl init-repository

I think you can tell the init-repository script to skip some of the modules you don't need (it pulls 12GB of data!) but I couldn't be bothered to find the flags for that. You can probably also avoid it by downloading the source archive instead of going via git.
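For the record, init-repository does have a --module-subset flag for exactly this; something like the following should trim the clone down (the module list is just an illustration, I have not verified it):

perl init-repository --module-subset=qtbase,qtdeclarative,qttools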

Today we finally have C/C++ package managers available so we don't have to bother building the dependencies anymore. Vcpkg and conan are both great tools that do the job. So forget about building OpenSSL, just install the binaries with vcpkg:

./vcpkg.exe install openssl
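Note that on Windows vcpkg defaults to the x86-windows triplet, which matches the 32-bit build configured below. If you are building 64-bit Qt, you can (as far as I recall the triplet syntax) request the matching OpenSSL explicitly:

./vcpkg.exe install openssl:x64-windows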

In your user env, add

OPENSSL_LIBS=-llibssl -llibcrypto
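From a regular cmd prompt this can be set persistently with setx (it only takes effect in newly opened shells):

setx OPENSSL_LIBS "-llibssl -llibcrypto"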

Now we can run the configure script inside the qt5 folder. First we do a release build and we link to openssl (run in Visual Studio cmd):

.\configure.bat -v -release -opensource -nomake examples -opengl desktop -platform win32-msvc2015 -openssl -openssl-linked -I C:/Users/me/git/vcpkg/installed/x86-windows/include -L C:/Users/me/git/vcpkg/installed/x86-windows/lib

Change the location of the OpenSSL include and lib folders to wherever your vcpkg installation directory is. I am still targeting msvc2015 for now but I plan to transition to 2019 eventually.

If you change the configure parameters between runs, make sure to delete the config.cache file, since in my experience it likes to keep unwanted information from previous runs.

Build it with jom:

jom
jom install

Now you have a release build and you can add it to Qt Creator under Tools->Options->Qt Versions by giving it the path to C:\Qt\Qt-5.15.1\bin\qmake.exe.

If you also need a debug build of Qt, you can repeat the configure and build steps with the -release flag replaced by -debug (remember to delete config.cache first).

And there you have it.. Qt built from source with OpenSSL support. Once you build your program you will also have to copy all the relevant .dll files into the .exe build directory (Qt5Core.dll, Qt5Gui.dll, Qt5Network.dll…). Which libraries you need to copy depends on what you are actually using in your code. You will also need to copy libcrypto-1_1.dll and libssl-1_1.dll from the vcpkg install directory.

For debug builds you need to copy the debug libraries (Qt5Cored.dll) instead.
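Alternatively, the windeployqt tool that ships in the Qt bin directory can collect most of the required Qt DLLs and plugins for you (you still have to copy the OpenSSL DLLs yourself):

windeployqt path\to\myapp.exe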

The future appears to be even brighter now that bincrafters have packaged Qt as a conan recipe, which means the next time I need to depend on Qt, I can run a single conan command and get the proper build automagically delivered to my PC along with all the transitive dependencies. The future is now.


Debugging Laravel in Eclipse PDT

I don't use PHP enough to justify buying a PHPStorm license so I am using Eclipse PDT instead. I am a bit rusty with Eclipse and PHP, and I couldn't really find anything on Google about debugging Laravel projects in Eclipse. I finally figured it out, so here is how.

Examples are done on Eclipse IDE Version: 2019-12 (4.14.0).

First, configure XDebug with Eclipse. On Fedora you can install it via

sudo dnf install php-xdebug

Check that XDebug remote debugging is enabled on a phpinfo() test page; if not, add the following line to your php.ini:

xdebug.remote_enable = 1
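For reference, these are the relevant XDebug 2.x settings with what I believe are their default values; you normally only need remote_enable, but the host and port can be set explicitly if your setup differs:

xdebug.remote_enable = 1
xdebug.remote_host = localhost
xdebug.remote_port = 9000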

Now in Eclipse, we first add a server. In Window->Preferences->PHP->Servers add a new server like this:

Document root is our Laravel public folder and base URL is the default host and port of

php artisan serve

which is http://127.0.0.1:8000.

Now check your Debug settings in PHP->Debug, select the newly created server and check that XDebug is set as the debugger:

If XDebug is not present here, configure it under PHP->Debug->Debuggers first.

Finally, under General->Web Browser, we select an external web browser to launch our website instead of the integrated Eclipse browser.

We are done with Preferences, so close it. Next to the Debug button in the main Eclipse toolbar, click on the arrow for the dropdown and select Debug configurations…

Create a new PHP Web Application config like this.

We point the file to the public index and map it to the root URL (the artisan serve default). Under the Debugger tab, check that XDebug is selected.

Now go to the terminal and serve your Laravel app as you usually would with

php artisan serve

Finally, run the "web" Debug configuration from Eclipse. Eclipse should go into Debug mode and open up your site in your selected browser. You can now place your breakpoints in controllers or wherever and things just work like you would expect.


Apache http to https redirect – use 307

Who knew that a simple thing like HTTP redirects could be so complicated? It turns out clients will just change POST to GET on a 301 (Postman, curl, everyone?), and the same with 302, which really behaves like 303 – also an old implementation "bug". Yeah, seriously.

If you have a REST API with POST (or other non-GET) request endpoints (who doesn't?), this behaviour will completely destroy everything. Many guides (top Google results) out there for configuring an Apache redirect do not mention this problem. The status code of choice would be 308 Permanent Redirect, but that is fairly new so I would not risk it; older clients don't know it exists. The only thing left is 307, which does not allow changing methods on redirect – exactly how it should be.

Solution:

<VirtualHost *:80>
    ServerName example.com
    Redirect 307 / https://example.com/
</VirtualHost>
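A quick sanity check with curl (against a hypothetical endpoint) should show the method-preserving redirect:

curl -i -X POST http://example.com/api/login

The response should be HTTP/1.1 307 Temporary Redirect with a Location header pointing at the https URL, and a well-behaved client will replay the POST there.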


Setting env variables with hyphen and running a program

Docker compose allows very permissive naming of your environment variables. It allows you to use hyphens and other "special" characters in variable names. When you need to use these variables in a regular shell you are out of luck: bash and many other shells do not allow hyphens in variable names. But this is merely a shell restriction, so how to do it?

With env

env -i 'TZ=Europe/Berlin' \
'PORT=8080' \
'BASE-URL=http://localhost:8080' \
'DB[0]_CONNECTION-URL=jdbc:postgresql://localhost:5432/postgres' \
'DB[0]_USERNAME=username' \
'DB[0]_PASSWORD=password' java -jar myapp.jar

Note that env -i ignores all inherited env variables, so you might need to redefine the ones you still want:

env -i JAVA_HOME=$JAVA_HOME \
'TZ=Europe/Berlin' \
'PORT=8080' java -jar myapp.jar
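On the Java side nothing special is needed: hyphenated names are illegal in the shell but perfectly legal in the process environment, so they can be read as usual. A minimal sketch (using the BASE-URL variable from above):

public class EnvDemo {
    public static void main(String[] args) {
        //Reads a variable whose name no POSIX shell could assign directly
        System.out.println(System.getenv("BASE-URL"));
    }
}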


Obscure IntelliJ IDEA "bug" with maven jdk profile activation "not working"

Since Java 9 it has been popular to activate additional dependencies, which were removed from the core JDK, through a Maven profile.

<profiles>
    <profile>
        <id>java9-modules</id>
        <activation>
            <jdk>[9,)</jdk>
        </activation>
        <dependencies>
            <dependency>
                <groupId>javax.xml.bind</groupId>
                <artifactId>jaxb-api</artifactId>
                <version>2.3.1</version>
            </dependency>
        </dependencies>
    </profile>
</profiles>

Using Java 11, jaxb-api would correctly show up in the Maven dependency tree and the Docker-packaged application would work correctly with the dependency jar on the classpath.

However, when running the app from IntelliJ it would fall apart with

Exception in thread "main" java.lang.NoClassDefFoundError: javax/xml/bind/annotation/XmlRootElement

Opening the module dependencies in the IDE would show that jaxb-api is not on the list of dependencies. IntelliJ is therefore not activating the Maven profile correctly even though:

  • maven compiler release is set to 11
  • project and Module SDK is set to Java 11
  • app is run with Java 11

Why is that? There is this snippet in the IntelliJ Maven Profiles documentation:

If you use a profile activation with the JDK condition (JDK tags in the POM: <jdk></jdk>), IntelliJ IDEA will use the JDK version of the Maven importer instead of the project's JDK version when syncing the project and resolving dependencies. Also, if you use https certificates, you need to include them manually for the Maven importer as well as for the Maven runner.

Why the IntelliJ developers decided to tie Maven profile activation to the importer I do not know. It would make much more sense to tie it to the Project/Module SDK. If an app is being developed with a Java 11 target, one would expect that profile to be active at build and runtime, not at import time. (The practical workaround, as far as I can tell, is to point the JDK of the Maven importer at Java 11 in the Maven importing settings.)

With more digging around I managed to find an issue complaining about this problem. Unfortunately the issue is 4 years old now with no apparent activity. Preferably the default should be changed; if not, at least give us an option to choose the source of profile activation in the preferences.


Receive only the data your client needs – full dynamic JSON filtering with Jackson

A lot of the time, the JSON returned by your REST API grows into incredibly big structures and data sizes due to business logic complexity that is added over time. Then there are API methods returning lists of objects, which can be huge in size. If you serve multiple clients, each one can have different demands on what is and is not needed from that data, so the backend can't decide on its own what to prune and what to keep. Ideally, the backend would always return the full JSON by default but allow clients to specify exactly what they want and adjust the response accordingly. We can achieve this using the power of the Jackson library.

Goal:
– allow REST API clients to decide on their own which parts of JSON to receive (full JSON filtering)

Resources for this tutorial:
– Microprofile or JakartaEE platform (JAX-RS)
– Jackson library
– Java classes (lib) representing your API responses which are serialized to JSON
– some custom code to bring things together

The lib module

First, let's define a few classes which represent our JSON responses.

public class Car {

  private Engine engine;

  private List<Wheel> wheels;

  private String brand;

 //Getters and setters..
}

public class Wheel {

  private BigDecimal pressure;

  //Getters and setters..
}

public class Engine {
  
  private int numOfCylinders;

  private int hp;

  //Getters and setters..
}

Our lib serialized to JSON would look something like this:

{
    "engine": {
        "numOfCylinders": 4,
        "hp": 180
    },
    "wheels": [
        {
            "pressure": 30.2
        },
        {
            "pressure": 30.1
        },
        {
            "pressure": 30.0
        },
        {
            "pressure": 30.3
        }
    ],
    "brand": "Jugular"
}

Let's say one of our clients only needs the engine horse power and brand information. We want to be able to specify a query parameter like filter=car:engine,brand;engine:hp and receive the following:

{
    "engine": {
        "hp": 180
    },
    "brand": "Jugular"
}

Step in Jackson

Jackson provides an annotation for such tasks called @JsonFilter. This annotation expects a filter name as a parameter, and a filter with that name must be applied to the serialization mapper, for example:

FilterProvider filters = new SimpleFilterProvider()
.addFilter("carFilter", SimpleBeanPropertyFilter.filterOutAllExcept("wheels"));      
String jsonString = mapper.writer(filters)...

As you can see, everything we need is already there, but it is a rather static affair. We need to take this and make it fully dynamic and client driven.

The reason a filter needs a name is that each one is bound to a class, and attribute filtering is done on that class. What we need to do is transform car:engine,brand into a carFilter with SimpleBeanPropertyFilter.filterOutAllExcept("engine", "brand").

For starters, let's add the filters to our classes:

@JsonFilter("carFilter")
public class Car {}

@JsonFilter("engineFilter")
public class Engine {}

@JsonFilter("wheelFilter")
public class Wheel {}

There is one thing about this that bothers me.. the filter name is a static String, so it is refactor-unfriendly if the class name changes some day. Couldn't we just name the filters by taking a look at the name of the underlying class? Yes we can, by extending Jackson introspection:

public class MyJacksonAnnotationIntrospector extends JacksonAnnotationIntrospector {

    @Override
    public Object findFilterId(Annotated a) {
        JsonFilter ann = _findAnnotation(a, JsonFilter.class);
        if (ann != null) {
            String id = ann.value();
            if (id.length() > 0) {
                return id;
            }
            else {
                try {
                    //Use className+Filter as filter ID if ID is not set, e.g. Car -> carFilter
                    Class<?> clazz = Class.forName(a.getName());
                    return StringUtils.uncapitalize(clazz.getSimpleName())+"Filter";
                } catch (ClassNotFoundException e) {
                    e.printStackTrace();
                }
            }
        }
        return null;
    }
}

With this, any class annotated with @JsonFilter("") will automatically get a filter called classNameFilter. We no longer need to specify filter names and keep them in sync with class names.

Our lib now looks like:

@JsonFilter("")
public class Car {}

@JsonFilter("")
public class Engine {}

@JsonFilter("")
public class Wheel {}

The next step is to transform the query parameters and apply them to our filter structure.

First, register a Jackson provider for the JAX-RS server:

@Provider
public class JacksonProvider extends JacksonJsonProvider implements ContextResolver<ObjectMapper> {

    private final ObjectMapper mapper;

    public JacksonProvider() {
        mapper = new ObjectMapper();
        mapper.registerModule(new JavaTimeModule());
        mapper.setFilterProvider(new SimpleFilterProvider().setFailOnUnknownId(false));
        mapper.setAnnotationIntrospector(new MyJacksonAnnotationIntrospector());
    }

    @Override
    public ObjectMapper getContext(Class<?> type) {
        return mapper;
    }
}

We register our own introspector and disable failures on unknown filter IDs (in case a client filters by something nonexistent).

The provider must be registered in your REST Application.

@ApplicationPath("")
public class MyApplication extends Application {

    @Override
    public Set<Class<?>> getClasses() {

        Set<Class<?>> classes = new HashSet<>();

        classes.add(JacksonProvider.class);

        return classes;
    }
}

Finally, we implement our own MessageBodyWriter to override the default serialization and apply the filters dynamically.

@Provider
@Produces(MediaType.APPLICATION_JSON)
public class JsonFilterProvider implements MessageBodyWriter<Object> {

    @Context
    private UriInfo uriInfo;

    @Context
    private JacksonProvider jsonProvider;

    public static final String PARAM_NAME = "filter";

    public boolean isWriteable(Class<?> aClass, Type type, Annotation[] annotations, MediaType mediaType) {
        return MediaType.APPLICATION_JSON_TYPE.equals(mediaType);
    }

    public long getSize(Object object, Class<?> aClass, Type type, Annotation[] annotations,
                        MediaType mediaType) {
        return -1;
    }

    public void writeTo(Object object, Class<?> aClass, Type type, Annotation[] annotations,
                        MediaType mediaType, MultivaluedMap<String, Object> stringObjectMultivaluedMap,
                        OutputStream outputStream) throws IOException, WebApplicationException {

        String queryParamValue = uriInfo.getQueryParameters().getFirst(PARAM_NAME);
        if (queryParamValue!=null && !queryParamValue.equals("")) {

            SimpleFilterProvider sfp = new SimpleFilterProvider().setFailOnUnknownId(false);

            //We link @JsonFilter annotation with dynamic property filter
            for (Map.Entry<String, Set<String>> entry : getFilterLogic(queryParamValue).entrySet()) {
                sfp.addFilter(entry.getKey() + "Filter", SimpleBeanPropertyFilter.filterOutAllExcept(entry.getValue()));
            }

            jsonProvider.locateMapper(aClass, mediaType).writer(sfp).writeValue(outputStream, object);
        }
        else {
            jsonProvider.locateMapper(aClass, mediaType).writeValue(outputStream, object);
        }
    }

    //Map of object names and set of fields
    private Map<String, Set<String>> getFilterLogic(String paramValue) {
        // ?jsonFilter=car:engine,brand;engine:numOfCylinders
        String[] filters = paramValue.split(";");

        Map<String, Set<String>> filterAndFields = new HashMap<>();

        for (String filterInstance : filters) {
            //car:engine,brand
            List<String> pair = Arrays.asList(filterInstance.split(":"));
            if (pair.size() != 2) {
                throw new RuntimeException("Invalid filter syntax: " + filterInstance);
            }

            Set<String> fields = new HashSet<>(Arrays.asList(pair.get(1).split(",")));
            filterAndFields.put(pair.get(0), fields);
        }

        return filterAndFields;
    }
}

The getFilterLogic method assembles the query parameter into a map of <String className, Set<String> fields>, which is then applied as a Jackson filter.
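Putting it all together, a client request could look something like this (hypothetical host and resource path):

curl "http://localhost:8080/api/cars?filter=car:engine,brand;engine:hp"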

Finally, we need to register our JsonFilterProvider in the Application, as we did with JacksonProvider.

@ApplicationPath("")
public class MyApplication extends Application {

    @Override
    public Set<Class<?>> getClasses() {

        Set<Class<?>> classes = new HashSet<>();

        classes.add(JacksonProvider.class);
        classes.add(JsonFilterProvider.class);

        return classes;
    }
}

One small deficiency of this solution is that once you specify a class with fields to filter, it will be filtered wherever it appears in the nested JSON structure; you can't filter a specific class at a specific level only. Realistically, I think this is a rather minor problem compared to the benefits and the simplicity of the implementation.

Finally, a question on documentation. How do you tell the client developer about all the possible filter object names and their attributes? If you use OpenAPI you are 95% there. Simply document that you can filter by model name followed by attribute name; the client developer can easily figure out the names from your OpenAPI specification. The only remaining problem is when you don't want to allow filtering on all classes. In this case my approach would be to document a filterable class in the OpenAPI description:

@ApiModel(description = "[Filterable] A car.")

This manual approach to documentation goes against the rest of the paradigm, so a real purist would write an OpenAPI extension that introspects all @JsonFilter annotations and modifies the descriptions automatically. But let's leave that for a future blog post.

A similar, more advanced and out-of-the-box solution is squiggly, which also uses Jackson under the hood.


HTTP Accept-Language request header to ResourceBundle

The HTTP Accept-Language header is sent by the client to inform the backend what the preferred language for the response is. In Java, the go-to utility for handling localization is ResourceBundle.

What is missing is a standard way to properly convert the incoming header to the correct ResourceBundle. Specifically,

ResourceBundle i18n = ResourceBundle.getBundle("bundles/translations", request.getLocale());

is insufficient. The HttpServletRequest::getLocale() method returns the top preferred locale, but if no ResourceBundle exists for it, getBundle will fall back to the default locale instead of going down the priority list. For example, this header:

Accept-Language: de-DE;q=1.0,fr-FR;q=0.9,en-GB;q=0.8

when the backend is missing de-DE translations will return the system default (e.g. en-GB) instead of fr-FR, which is second by priority.

Clients don't usually request languages unknown to the backend, but it is possible in theory, and languages can be added automatically by the client platform (iOS does this) without the client knowing.

We need to iterate the locale chain and find the highest match that exists as a bundle.

Below is a sample in a JAX-RS environment.

@RequestScoped
public class Localization {

    @Context
    private HttpServletRequest request;

    private ResourceBundle i18n;

    @PostConstruct
    void postConstruct() {
        //List of locales from Accept-Language header
        List<Locale> locales = Collections.list(request.getLocales());

        if (locales.isEmpty()) {
            //Fall back to default locale
            locales.add(request.getLocale());
        }

        for (Locale locale : locales) {
            try {
                i18n = ResourceBundle.getBundle("bundles/translations", locale);
                if (!languageEquals(i18n.getLocale(), locale)) {
                    //Default fallback detected
                    //The resource bundle that was returned has different language than the one requested, continue
                    //Only language tag is checked, no support for detecting different regions in this sample
                    continue;
                }
                break;
            }
            catch (MissingResourceException ignore) {
            }
        }
    }

    private boolean languageEquals(Locale first, Locale second) {
        return getISO2Language(first).equalsIgnoreCase(getISO2Language(second));
    }

    private String getISO2Language(Locale locale) {
        //Take only the language part, e.g. "en" from "en-GB" or "en_GB"
        String[] localeStrings = locale.getLanguage().split("[-_]+");
        return localeStrings[0];
    }

    public ResourceBundle i18n() {
        return this.i18n;
    }
}
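Usage is then straightforward; a minimal sketch assuming CDI injection and a hypothetical welcome.message key in the bundle:

@Inject
private Localization localization;

public String greeting() {
    return localization.i18n().getString("welcome.message");
}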


That moment when you need to look up definition of C++ for loop

I was getting a segfault on an old piece of code which I maintain. The culprit was pinpointed to this:

bool found = false;
vector<string> :: iterator i;
for (i = v.begin(); !found && i != v.end(); ++i) {
    if (name == *i) {
        found = true;
    }
}
if (found) {
   v.erase( i ); // <-- segfault here
}

I went through this piece of code at least 10 times without noticing the problem. The snippet is simple enough.. when a match is found, set found to true and that breaks the loop since the loop condition now evaluates to false. The iterator remains at the position of the matched element.

WRONG.

What we are actually getting is iterator+1.

What we don't see directly from the code is that the increment happens before the condition is evaluated for the next iteration, giving us iterator+1, which causes a segfault if the match is found on the last element.
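One way to sidestep the trap entirely is to drop the flag and use std::find, erasing only on a hit. A minimal sketch:

#include <algorithm>
#include <string>
#include <vector>
using namespace std;

void removeByName(vector<string>& v, const string& name) {
    //find returns an iterator to the first match, or v.end() if there is none
    vector<string>::iterator it = find(v.begin(), v.end(), name);
    if (it != v.end()) {
        v.erase(it); //the iterator still points at the match, no off-by-one
    }
}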


Keycloak OAuth endpoints for Postman/HTTP Clients

When testing REST services secured by Keycloak, you need to retrieve access tokens via Postman or a similar REST client. If you want to implement your own client that has to authenticate with a token, you also need to know the Keycloak OpenID endpoints in order to retrieve the access token, refresh it, or end the session (logout).

Retrieving the tokens for a public client using username and password

A public client is typically used for web applications and other client-side apps.

Method: POST
URL: https://keycloak.example.com/auth/realms/myrealm/protocol/openid-connect/token
Body type: x-www-form-urlencoded
Form fields:
client_id <my-client-name>
grant_type password
username <username>
password <password>
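The same request as a curl one-liner (replace the placeholders with your own values); the other endpoints below follow the same pattern:

curl -X POST "https://keycloak.example.com/auth/realms/myrealm/protocol/openid-connect/token" \
  -d "client_id=my-client-name" \
  -d "grant_type=password" \
  -d "username=myuser" \
  -d "password=mypassword"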

Retrieving the tokens for a confidential client using a client secret

A confidential client is typically used for secure apps on the back-end.

Method: POST
URL: https://keycloak.example.com/auth/realms/myrealm/protocol/openid-connect/token
Body type: x-www-form-urlencoded
Form fields:
client_id <my-confidential-client-name>
grant_type client_credentials
client_secret <my-client-secret>

Retrieving an access token with a refresh token

The first two methods will yield you an access token, which you use in the Authorization HTTP header, and a refresh token, which you save for later. Refresh tokens have a much longer expiry time than access tokens. The idea is that when the access token expires, you use the refresh token to get a new access token. This request also gives you a new refresh token, so you can keep the session alive until the maximum refresh token expiry time is reached. The refresh token expiry time equals the session expiry time.

Method: POST
URL: https://keycloak.example.com/auth/realms/myrealm/protocol/openid-connect/token
Body type: x-www-form-urlencoded
Form fields:
client_id <my-client-name>
grant_type refresh_token
refresh_token <my-refresh-token>

Logout the session

To log out and invalidate the session, call the /logout endpoint with your refresh token. The validity of the refresh token is essentially the validity of your entire session.

Method: POST
URL: https://keycloak.example.com/auth/realms/myrealm/protocol/openid-connect/logout
Body type: x-www-form-urlencoded
Form fields:
client_id <my-client-name>
refresh_token <my-refresh-token>
