Summary of the ICSE 2011 panel "What Industry Wants from Research". This is a summary of the presentations from that panel, which I presented in an invited talk at the CSER meeting in Toronto, November 2011.
1. Highlights from an ICSE 2011 Panel: What Industry Wants from Research
Panel Organizers: Jorge Aranda, Daniela Damian, Marian Petre, Margaret-Anne Storey, Greg Wilson
4. Questions we asked…
What is your perception of software engineering research? Conferences, journals, collaborations…
What counts as evidence?
How do you find out about results?
What topics should software engineering research focus on?
5. Perceptions of software research
Focused on organizations that don't compare to us
Results may fail to scale
Piecemeal improvements
Irrelevant, dated, ignoring cutting-edge problems
http://tinyurl.com/icse2011panel
6. "[I'm afraid] that industrial software engineers will think that I'm now doing academic software engineering and then not listen to me. (…) if I start talking to them and claim that I'm doing software engineering research, after they stop laughing, they're gonna stop listening to me. Because it's been so long since anything actually relevant to what practitioners do has come out of that environment, or at least the percentage of things that are useful that come out of that environment is so small."
7. What counts as evidence?
Preference for quantitative data and statistical significance
8. What counts as evidence?
Preference for quantitative data and statistical significance
"managers are coin operated in some sense. If you can't quantify it in terms of time or in terms of money, it doesn't make much difference to them. (…) I think there does need to be some notion of a numeric or at least an objective measure."
9. What counts as evidence?
Preference for quantitative data and statistical significance
Association of qualitative data with 'mere anecdote'
And yet… influential opinions are persuasive:
"I trust the people I hire"
10. Difficulty in applying our findings
Resistance to change for small potential gains
"(…) it would depend in part on how cumbersome your techniques are; how much retraining I'm going to have to do on my staff. (…) I might decide that even if you're legit and you actually do come up with 15%, that that's not enough to justify it."
11. Dissemination of results
Results are not getting out there!
Suggestions:
Researchers should be attending practitioner conferences
Need to distill research results for practitioners
12. Research topics of interest
Developer issues
Tool issues
Code issues
User issues
Evaluation issues
Management issues
15. Questions we posed to the panelists…
What perception does industry have of academic research?
What kind of evidence is compelling?
Which research questions should be addressed?
Most useful empirical finding? Success story?
How to improve dissemination of research results?
How to engage industry in research?
16. Panel participants
David Weiss, Iowa State University
John Penix, Google
Lionel Briand, Simula Research Laboratory
Peri Tarr, IBM Thomas J. Watson Research Center
Tatsuhiro Nishioka, Corporate Software Engineering Center, Toshiba Corporation
Wolfram Schulte, Microsoft Research
18. Panel Highlights: John Penix, Google
• Striving for continuous improvement of useful tools:
– "We can't improve what we can't measure"
– "Our goal: make the tools disappear from the workflow"
• In terms of scale… 5000 devs in 40 offices, 2000 active projects, single monolithic code tree with mixed languages, 20+ code changes per minute, 50,000 builds per day, 50 million test cases run per day…
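The scale figures Penix quoted imply some rates he did not state directly. A quick back-of-envelope check (the inputs are from the slide; the derived per-day, per-build, and per-change figures are my own arithmetic, not part of the panel):

```python
# Inputs quoted on the slide for Google's development environment.
changes_per_minute = 20        # "20+ code changes per minute" (lower bound)
builds_per_day = 50_000        # "50,000 builds per day"
tests_per_day = 50_000_000     # "50 million test cases run per day"

# Derived rates (assuming the change rate holds around the clock).
changes_per_day = changes_per_minute * 60 * 24   # 28,800 changes/day
tests_per_build = tests_per_day / builds_per_day # 1,000 tests/build
builds_per_change = builds_per_day / changes_per_day  # ~1.74 builds/change

print(f"code changes/day:  {changes_per_day:,}")
print(f"test cases/build:  {tests_per_build:,.0f}")
print(f"builds per change: {builds_per_change:.2f}")
```

Even as a lower bound, this means every code change triggers more than one full build and roughly a thousand test-case executions, which is why the slide frames tooling as something that must "disappear from the workflow".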
19. Panel Highlights: John Penix (2)
• Problems he cares about…
– How do developers work and collaborate?
– Team productivity vs. individual productivity
• All MS students should know how to do a user study
• Faculty advice: maintain contact with your students (good links to industry)
20. Panel Highlights: Lionel Briand, Simula Research Labs
• Perceptions from industry:
– Disconnect; engage researchers in industry
– Scalability (heuristics versus exact methods)
– Applicability (context factors, constraints), realistic conditions (human factors)
– Fundamental research questions haven't changed…
– Relevant empirical findings?
• (Model-based) testing and empirical research at Microsoft, inspections
22. Panel Highlights: Peri Tarr, IBM Research
• So you want to marry an industrial?
• Success comes at a high price!
– Huge time commitment, opportunity cost
– You become a development resource, for better and for worse
– Effort in developing trust; what you do matters to them; problems of today
• Industry balances strategies and considers ROI across portfolios
– Do you add risk?
– Evidence that customers can use it, want it, will pay for it?
28. Discussion Points
• Software development is a wicked problem… how do we compete with snake-oil salesmen with our modest solutions?
• How to deal with the kinds of evidence accepted… many kinds of important software development problems are not amenable to controlled experimentation?
• How to become a better storyteller? How to deal with the apparent disconnect?
• How to do industrially relevant research without hurting one's academic career?