[Soot-list] Spark and custom entry points

Marc-Andre Laverdiere-Papineau marc-andre.laverdiere-papineau at polymtl.ca
Wed Mar 6 12:18:09 EST 2013


Hello Michael,

I have been working on Web services and Servlets with Bernhard Berger, 
so we both got some good insights so far.

Were you thinking of doing IFDS analyses? If so, one 'trick' is to make 
the calls to your library in your dummy main inside a loop that looks 
like this:

while (true) {
    switch (random.nextInt(n + 1)) {
        case 1: a(); break;
        case 2: b(); break;
        ...
        case n: return;
    }
}
That way, the solver will operate without any assumption about the 
ordering of the operations.

If you want to model some life cycles, then that is a bit more work, but 
still doable in this pattern.
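To make the pattern concrete, here is a minimal, self-contained sketch of such a dummy main. Lib, a(), and b() are hypothetical placeholder names standing in for your library's classes and methods, not any real API; the key points are that the dummy main allocates the instances itself (so Spark sees the allocation sites) and that the random switch leaves the call order unconstrained:

```java
import java.util.Random;

// Hypothetical library class standing in for the code under analysis.
class Lib {
    Lib(Object o) {}
    void a() {}
    void b() {}
}

public class DummyMain {
    public static void main(String[] args) {
        Random random = new Random();
        // Allocate here so the pointer analysis has concrete allocation sites.
        Lib lib = new Lib(new Object());
        while (true) {
            switch (random.nextInt(3)) {
                case 0: lib.a(); break;
                case 1: lib.b(); break;
                default: return; // no assumption about how many calls occur
            }
        }
    }
}
```

Because every case is reachable on every iteration, the solver cannot assume any particular ordering or count of the library calls.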

Marc-André Laverdière-Papineau
Doctorant - PhD Candidate

On 13-03-06 11:58 AM, Michael Faes wrote:
> Wow, that is a big bummer. If this is truly the case, then I think it's
> absolutely necessary that this be stated somewhere.
>
> I also already thought about generating dummy methods. The problem is
> that I'm analyzing libraries, and my analysis should consider all
> possible uses of a library. So every method could be called with any set
> of parameters. I don't think it's possible to generate a main method
> that reflects this usage.
>
> If someone has another idea, please let me know.
>
> Thanks,
> Michael
>
> -------- Original-Nachricht --------
> Betreff: Re: [Soot-list] Spark and custom entry points
> Von: Marc-Andre Laverdiere-Papineau
> <marc-andre.laverdiere-papineau at polymtl.ca>
> An: soot-list at sable.mcgill.ca
> Datum: 06.03.2013 17:14
>
>> Hello,
>>
>> I worked for a while on a custom entry point framework, and hit the same
>> brick wall.
>>
>> Spark doesn't reason well when the entry points are not static, so the
>> trick is to generate a dummy main that creates the instances of your
>> classes and calls them.
>>
>> Marc-André Laverdière-Papineau
>> Doctorant - PhD Candidate
>>
>> On 13-03-06 11:11 AM, Michael Faes wrote:
>>> Hi again,
>>>
>>> Using the information in Quentin's script I was able to build the
>>> develop branch of soot. It took me quite some time, as the whole build
>>> procedure is not really compatible with Windows, even in a Cygwin
>>> environment. But it worked in the end, so thanks!
>>>
>>> However, I encountered another problem. Using Spark with custom entry
>>> points does not seem to work at all. Using CHA, this simple class:
>>>
>>> public class CallGraphTest {
>>>
>>>        private final Object object;
>>>
>>>        public CallGraphTest(final Object object) {
>>>            this.object = object;
>>>        }
>>>
>>>        @Override
>>>        public int hashCode() {
>>>            return object.hashCode();
>>>        }
>>> }
>>>
>>> produces a reasonable call graph with about 90 edges. Using Spark, the
>>> call graph is plain empty. As mentioned before, I'm setting up Soot to
>>> use all public methods as entry points.
>>>
>>> Now, I checked the mailing list archive and found this:
>>>
>>> http://www.sable.mcgill.ca/pipermail/soot-list/2011-December/003983.html
>>>
>>> It suggests that Spark may have problems with non-static entry points.
>>> Is this still the case? Is there a way around this problem?
>>>
>>> Thanks again for your help.
>>> Michael
>>> _______________________________________________
>>> Soot-list mailing list
>>> Soot-list at sable.mcgill.ca
>>> http://mailman.cs.mcgill.ca/mailman/listinfo/soot-list
>>>

