
Possibility to explicitly remove Serialization support for a lambda

As is well known, it’s easy to add Serialization support to a lambda expression when the target interface does not already inherit Serializable, via an intersection cast like (TargetInterface & Serializable) () -> {/*code*/}.
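For illustration, here is a minimal, self-contained sketch of that cast in action (class name invented here): the plain lambda is not serializable, while the cast one round-trips through a byte stream:

```java
import java.io.*;

public class CastDemo {
    public static void main(String[] args) throws Exception {
        // plain lambda: its generated class does not implement Serializable
        Runnable plain = () -> System.out.println("plain");
        System.out.println(plain instanceof Serializable);   // false

        // intersection cast: the compiler emits a serializable lambda
        Runnable ser = (Runnable & Serializable) () -> System.out.println("ser");
        System.out.println(ser instanceof Serializable);     // true

        // round-trip through serialization
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(ser);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            Runnable copy = (Runnable) ois.readObject();
            copy.run();   // prints "ser"
        }
    }
}
```

The round-trip works because the compiler also emits a $deserializeLambda$ method into the capturing class, a detail that matters for the security discussion below.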

What I ask for, is a way to do the opposite, explicitly remove Serialization support when the target interface does inherit Serializable.

Since you can’t remove an interface from a type, a language-based solution might look like (@NotSerializable TargetInterface) () -> {/* code */}. But as far as I know, no such solution exists. (Correct me if I’m wrong; that would be a perfect answer.)

Denying serialization even when the class implements Serializable was a legitimate technique in the past. With classes under the programmer’s control, the pattern looks like:

import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectStreamException;

public class NotSupportingSerialization extends SerializableBaseClass {
    // invoked reflectively by ObjectOutputStream; vetoes serialization
    private void writeObject(java.io.ObjectOutputStream out) throws IOException {
        throw new NotSerializableException();
    }
    // invoked reflectively by ObjectInputStream; vetoes deserialization
    private void readObject(java.io.ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        throw new NotSerializableException();
    }
    // invoked when the stream lists this class but carries no data for it
    private void readObjectNoData() throws ObjectStreamException {
        throw new NotSerializableException();
    }
}
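A quick, self-contained demonstration of the veto pattern (base and subclass names are invented for this sketch): attempting to serialize the subclass fails even though it inherits Serializable:

```java
import java.io.*;

class SerializableBase implements Serializable {}

class NoSerialization extends SerializableBase {
    // veto: called reflectively by ObjectOutputStream
    private void writeObject(ObjectOutputStream out) throws IOException {
        throw new NotSerializableException("serialization denied");
    }
    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        throw new NotSerializableException("deserialization denied");
    }
}

public class VetoDemo {
    public static void main(String[] args) throws Exception {
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(new NoSerialization());
            System.out.println("serialized");
        } catch (NotSerializableException e) {
            System.out.println("denied: " + e.getMessage());
        }
    }
}
```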

But for a lambda expression, the programmer doesn’t have that control over the lambda class.


Why would someone ever bother to remove the support? Well, besides the larger code generated to include the Serialization support, it creates a security risk. Consider the following code:

public class CreationSite {
    public static void main(String... arg) {
        TargetInterface f=CreationSite::privateMethod;
    }
    private static void privateMethod() {
        System.out.println("should be private");
    }
}

Here, access to the private method is not exposed, even though the TargetInterface is public (interface methods are always public), as long as the programmer takes care not to pass the instance f to untrusted code.

However, things change if TargetInterface inherits Serializable. Then, even if the CreationSite never hands out an instance, an attacker can create an equivalent instance by deserializing a manually constructed stream. If the interface for the above example looks like

public interface TargetInterface extends Runnable, Serializable {}

it’s as easy as:

// requires imports: java.io.*, java.lang.invoke.SerializedLambda,
// java.lang.invoke.MethodHandleInfo
SerializedLambda l = new SerializedLambda(CreationSite.class,
    TargetInterface.class.getName().replace('.', '/'), "run", "()V",
    MethodHandleInfo.REF_invokeStatic,
    CreationSite.class.getName().replace('.', '/'), "privateMethod",
    "()V", "()V", new Object[0]);
ByteArrayOutputStream os = new ByteArrayOutputStream();
try(ObjectOutputStream oos = new ObjectOutputStream(os)) { oos.writeObject(l); }
TargetInterface f;
try(ByteArrayInputStream is = new ByteArrayInputStream(os.toByteArray());
    ObjectInputStream ois = new ObjectInputStream(is)) {
    f = (TargetInterface) ois.readObject();
}
f.run(); // invokes privateMethod

Note that the attacking code does not perform any action that a SecurityManager would deny.


The decision to support Serialization is made at compile time. It requires a synthetic factory method added to CreationSite and a flag passed to the metafactory method. Without the flag, the generated lambda will not support Serialization even if the interface happens to inherit Serializable. The lambda class will even have a writeObject method like in the NotSupportingSerialization example above. And without the synthetic factory method, deserialization is impossible.
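The metafactory flag can be observed by calling LambdaMetafactory directly. The sketch below (names invented for illustration) uses the plain metafactory, i.e. without FLAG_SERIALIZABLE, to bind a lambda to an interface that inherits Serializable; as described above, the generated class vetoes serialization with a throwing writeObject, so the write attempt fails:

```java
import java.io.*;
import java.lang.invoke.*;

public class MetafactoryDemo {
    public interface SerRunnable extends Runnable, Serializable {}

    public static void hello() { System.out.println("hello"); }

    public static void main(String[] args) throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();
        MethodType samType = MethodType.methodType(void.class);
        MethodHandle impl =
            lookup.findStatic(MetafactoryDemo.class, "hello", samType);
        // plain metafactory: no FLAG_SERIALIZABLE, although the
        // target interface inherits Serializable
        CallSite site = LambdaMetafactory.metafactory(lookup, "run",
            MethodType.methodType(SerRunnable.class), samType, impl, samType);
        SerRunnable r = (SerRunnable) site.getTarget().invokeExact();
        r.run();

        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(r);
            System.out.println("serialized");
        } catch (NotSerializableException e) {
            System.out.println("not serializable");
        }
    }
}
```

Compiled lambda expressions go through the same machinery; javac only requests the serializable variant (via altMetafactory) when the compile-time target type is serializable.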

This leads to the one solution I found. You can create a copy of the interface, modify it to not inherit Serializable, and compile against that modified version. Then, even if the real version at runtime inherits Serializable, Serialization will still be denied.

Well, another solution is to never use lambda expressions/method references in security-relevant code, at least when the target interface inherits Serializable; and that has to be re-checked whenever compiling against a newer version of the interface.

But I think there must be better, preferably in-language, solutions.


Answer

How to handle serializability was one of the biggest challenges for the EG; suffice it to say that there were no great solutions, only tradeoffs between various downsides. Some parties insisted that all lambdas be automatically serializable (!); others insisted that lambdas never be serializable (which seemed an attractive idea at times, but would sadly violate user expectations).

You note:

Well, another solution is to never use lambda expressions/method references in security-relevant code…

In fact, the serialization spec now says exactly that.

But, there is a fairly easy trick to do what you want here. Suppose you have some library that wants serializable instances:

public interface SomeLibType extends Runnable, Serializable { }

with methods that expect this type:

public void gimmeLambda(SomeLibType r)

and you want to pass lambdas into it, but not have them be serializable (and take the consequences of that.) So, write yourself this helper method:

public static SomeLibType launder(Runnable r) {
    return new SomeLibType() {
        public void run() { r.run(); }
    };
}

Now you can call the library method:

gimmeLambda(launder(() -> myPrivateMethod()));

The compiler will convert your lambda into a non-serializable Runnable, and the laundering wrapper will wrap it with an instance that satisfies the type system. When you try to serialize it, that will fail since r is not serializable. More importantly, you can’t forge access to the private method, because the $deserializeLambda$ support that’s needed in the capturing class won’t even be there.
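Putting the pieces together, here is a self-contained sketch of the laundering trick (SomeLibType is re-declared locally to stand in for the library’s type). The wrapper runs fine, but serializing it fails because the anonymous class captures the non-serializable lambda in a field:

```java
import java.io.*;

public class LaunderDemo {
    // stand-in for the library's serializable functional type
    public interface SomeLibType extends Runnable, Serializable {}

    public static SomeLibType launder(Runnable r) {
        // anonymous class implements SomeLibType (hence Serializable),
        // but holds the plain, non-serializable Runnable in a field
        return new SomeLibType() {
            public void run() { r.run(); }
        };
    }

    public static void main(String[] args) throws Exception {
        SomeLibType laundered = launder(() -> System.out.println("ran"));
        laundered.run();   // works normally

        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(laundered);
            System.out.println("serialized");
        } catch (NotSerializableException e) {
            System.out.println("rejected: captured lambda is not serializable");
        }
    }
}
```

Note that serialization fails at runtime rather than being rejected at compile time; callers that genuinely need to serialize the instance cannot use laundered values.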

User contributions licensed under: CC BY-SA