Which classes to serialize in Spark?

I am a bit confused about which Java classes to make Serializable in a Spark application.

As of now my application is working with no exceptions. I have made only one class Serializable; this class contains the Spark Streaming code, transformations (like flatMap), and actions (like foreachRDD).

But there are some classes that I instantiate inside this class and whose methods I call, and these are not Serializable. I am running with one driver and one worker node.

Can anybody shed some light on this? Thanks.

In the code below, SparKInitClass is the starting point of the Spark application; it calls the operate() method of the Main class to do some work.

The Main class uses two other classes (OneClass and TwoClass) to do the work.
Which of these need to be Serializable (Main, OneClass, TwoClass)?

class SparKInitClass {
    public static void main(String[] args) {
        SparkSession ss = ...; // create SparkSession
        new Main().operate();
    }
}

class Main {

    public void operate() {
        // jssc is the JavaStreamingContext (creation omitted here)
        JavaInputDStream<ConsumerRecord<A, B>> messages = KafkaUtils.createDirectStream(
                jssc,
                LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<A, B>Subscribe(topicsSet, kafkaParams));

        OneClass oc = new OneClass(); // created on the driver, captured by the closure below

        messages.foreachRDD(rdd -> {

            TwoClass tc = new TwoClass(); // created inside the closure
            tc.compute();

            oc.add(); // oc is captured from the enclosing scope
        });
    } // operate ends here

} // Main class ends here
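For what it's worth, the rule can be reproduced without Spark at all: Spark ships closures to workers using plain Java serialization, so a serializable lambda behaves the same way locally. Below is a minimal standalone sketch (OneClass and TwoClass here are hypothetical stand-ins with empty bodies, not the real classes from my application):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class ClosureSerializationDemo {

    // Stand-ins for the classes in the question (bodies are placeholders).
    static class OneClass implements Serializable {
        void add() { /* ... */ }
    }

    static class TwoClass { // deliberately NOT Serializable
        void compute() { /* ... */ }
    }

    // A lambda whose target type extends Serializable can be written
    // with ObjectOutputStream, just like a Spark closure.
    interface SerializableRunnable extends Runnable, Serializable { }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        OneClass oc = new OneClass(); // defined outside the closure

        SerializableRunnable closure = () -> {
            TwoClass tc = new TwoClass(); // created inside: never serialized
            tc.compute();
            oc.add();                     // captured: OneClass must be Serializable
        };
        serialize(closure);
        System.out.println("closure capturing OneClass serialized OK");

        TwoClass shared = new TwoClass(); // captured, but not Serializable
        SerializableRunnable bad = () -> shared.compute();
        try {
            serialize(bad);
        } catch (NotSerializableException e) {
            System.out.println("capturing TwoClass fails: " + e.getMessage());
        }
    }
}
```

This mirrors the structure above: oc is captured by the foreachRDD closure, so OneClass must be Serializable, while TwoClass is created inside the closure and is never serialized. If that reasoning is right, Main itself would only need to be Serializable if the lambda captured `this` (for example, by touching a field or instance method of Main).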