How to get a rock star supervisor


The Thesis Whisperer

How do you choose the right supervisor? How do you know if it might be time for a change?

In this post, Associate Professor Evonne Miller offers a checklist of qualities of an awesome supervisor. I now blog with Evonne over at The Supervision Whisperers, where the tagline is “Just like the Thesis Whisperer, but with more paperwork”.

Evonne is the Director of Research Training for the Creative Industries Faculty at Queensland University of Technology in Brisbane, Australia. She detests meetings and leans towards the hands-off supervision style, but her students will attest that she is passionate about their research and does yell at them (kindly) when needed.

Whether it is art, science or a little bit of magic, choosing the ‘right’ PhD supervisor is one of the most important decisions you will make. There is no doubt that a little bit of luck (or magic) is involved, and both…


Doctoral Symposium: What, Why and How?


Wylliams Barbosa Santos

The aim of the doctoral symposium is to give PhD students the opportunity to present their research and receive constructive feedback from a panel of senior researchers in a specific area. The doctoral symposium is run in a highly interactive workshop format. To obtain maximum benefit from the doctoral symposium, students should consider participating after they have settled on a research topic, with a defined problem statement and some ideas about the solution that they want to discuss.

“Do you have any previous experience with a Doctoral Symposium? Share it with us!”

Objectives of the Doctoral Symposium

  1. Present their research work in a relaxed and supportive environment;
  2. Receive feedback and suggestions from peers and experienced faculty;
  3. Gain an overview of the breadth and depth of research;
  4. Obtain insight into directions for research taken by other doctoral candidates;
  5. Discuss concerns about research, supervision, the job market, and other issues;
  6. Network with peers and…


How can I analyze data from qualitative research?


Interesting post from my friend Fernando Kenji!

Fernando Kenji Kamei

Which direction to go?

Years ago, during my master’s degree, I studied Systematic Reviews and Qualitative Research. You can find a wide variety of material about these topics in technical reports, articles and books, which explain how to conduct the study and how to extract data, but little of it focuses on how to analyze and synthesize the data. So, the big question is: how can we analyze qualitative data? [my question now during the PhD]

What kind of method do you prefer? What is the difference between the existing methods?

I have always heard my teachers say that, to conduct qualitative research, the best methods to analyze data are Grounded Theory or Thematic Analysis. But what is Thematic Analysis? What is Grounded Theory?

There is a lot of material about this in the literature, but for Computer Science I recommend reading…


Surfing on the Waves of Publication!


In [1], Meyer (2013) criticizes current scientific publication methods. He believes that, with technological evolution, it is possible to transform articles into dynamic documents that evolve over time. In other words, the publishing process should also evolve. For this reason, the author presents the waves of publication.

As stated by Meyer, “The very notion of publication has changed. The process part is gone; only the result remains, and that result can be an evolving product, not a frozen artifact.”

Moreover, the author suggests starting with a blog post, then registering the work’s first version as a technical report (usually not considered prior publication), then submitting it to a workshop, then to a conference, and finally publishing a version of record in a journal (an extended version of a conference paper including at least 30% new material). The figure below summarizes the publication evolution process (the waves of publication).

[Figure: the waves of publication — blog post, technical report, workshop, conference, journal]

In blog posts, the writer’s audience works as “reviewers” through comments. They contribute different perspectives and points of view. The drawback is the low reliability of the feedback. In technical reports, on the other hand, the researcher’s advisor (or the person responsible) reviews the work. Generally, institutions/organizations are involved in the process.

In workshops, reviewers (at least two or three) evaluate the paper; we can consider this the “first” external evaluation. Although conferences and journals are harder to publish in, the reviewers’ contributions are important for evolving the work. Furthermore, there is an editor-in-chief who coordinates the review process.

According to Meyer, “there is a whole gradation of prestige, well known to researchers in every particular field: conferences are better than workshops, some journals are as good as conferences or higher, some conferences are far more prestigious than others, and so on” [1].

We support the idea that publication should evolve along with the research and that the journal paper should combine different kinds of studies to allow a broader view of the work. Hence, we suggest combining the waves of publication with the methodology of Mafra et al. [2] as an iterative and incremental publication process (see the figure below).

[Figure: the waves of publication combined with the methodology of Mafra et al.]

According to Kitchenham (2007), “A systematic literature review (SLR) is a means of identifying, evaluating and interpreting all available research relevant to a particular topic area” [3]. The SLR serves as the basis for defining the state of the art; at this stage, the focus is on Evidence-Based Software Engineering conferences.

Since the workshop is the work’s first contact with the research community, the initial idea can be submitted to a Software Engineering workshop. Once the idea is established, the work is prepared for conference evaluation. Technology Development includes the development of tools, processes, frameworks, models, guidelines, approaches, and so on.

Moreover, the researcher should carry out the Empirical Evaluation of the Technology, which includes controlled experiments, case studies, qualitative research, interviews, etc. Finally, the different papers should be combined into a journal paper by adding at least 30% new material.
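
To make the combined process easier to follow without the original figure, here is a minimal sketch in Python (chosen only for illustration): the Stage structure and the stage descriptions below are simply our own summary of the text above, not something defined in [1] or [2].

```python
# Minimal sketch of the iterative and incremental publication process
# described above. The Stage dataclass and its entries are illustrative
# assumptions that only summarize the text; they are not part of [1] or [2].
from dataclasses import dataclass


@dataclass
class Stage:
    name: str    # what is produced at this stage
    venue: str   # where it is submitted


WAVES_OF_PUBLICATION = [
    Stage("Systematic Literature Review (state of the art)",
          "Evidence-Based Software Engineering conference"),
    Stage("Initial idea (problem statement and first solution ideas)",
          "Software Engineering workshop"),
    Stage("Technology Development (tool, process, framework, model, ...)",
          "conference"),
    Stage("Empirical Evaluation (experiment, case study, interviews, ...)",
          "conference"),
    Stage("Version of record (combined papers + at least 30% new material)",
          "journal"),
]

for i, stage in enumerate(WAVES_OF_PUBLICATION, start=1):
    print(f"{i}. {stage.name} -> {stage.venue}")
```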

References

[1] B. Meyer, “The Waves of Publication,” Communications of the ACM Blog, January 2013.

[2] S. Mafra, R. Barcelos, G. Travassos, “Aplicando uma Metodologia Baseada em Evidência na Definição de Novas Tecnologias de Software” (in Portuguese), XX Brazilian Symposium on Software Engineering (SBES), 2006.

[3] B. Kitchenham, S. Charters, “Guidelines for performing Systematic Literature Reviews in Software Engineering,” Keele University and Durham University Joint Report, 2007.

Research Type Facet


Validation Research: Tools investigated are novel and have not yet been implemented in practice. Tools used are, for example, experiments, i.e., work done in the lab.
Evaluation Research: Tools are implemented in practice and an evaluation of the tool is conducted. That means it is shown how the tool is implemented in practice (solution implementation) and what the consequences of the implementation are in terms of benefits and drawbacks (implementation evaluation). This also includes the identification of problems in industry.
Solution Proposal: A solution to a problem is proposed; the solution can be either novel or a significant extension of an existing tool. The potential benefits and the applicability of the solution are shown by a small example or a good line of argumentation.
Philosophical Paper: These papers sketch a new way of looking at existing things by structuring the field in the form of a taxonomy or conceptual framework.
Opinion Paper: These papers express someone’s personal opinion on whether a certain tool is good or bad, or how things should be done. They do not rely on related work or research methodologies.
Experience Paper: Experience papers explain what has been done in practice and how; it has to be the personal experience of the author.
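
As a rough illustration of how this facet could be applied (for example, when classifying primary studies during a literature review), the sketch below tags a few papers with one class each and counts the classes. Only the class names come from the table above; the Python structure and the example paper IDs are hypothetical.

```python
# Illustrative sketch only: the ResearchType names come from the table
# above, but the extraction sheet and paper IDs (P01, P02, ...) are
# hypothetical examples, not real data.
from collections import Counter
from enum import Enum


class ResearchType(Enum):
    VALIDATION_RESEARCH = "Validation Research"
    EVALUATION_RESEARCH = "Evaluation Research"
    SOLUTION_PROPOSAL = "Solution Proposal"
    PHILOSOPHICAL_PAPER = "Philosophical Paper"
    OPINION_PAPER = "Opinion Paper"
    EXPERIENCE_PAPER = "Experience Paper"


# Hypothetical extraction sheet: (paper id, research type)
extraction_sheet = [
    ("P01", ResearchType.SOLUTION_PROPOSAL),
    ("P02", ResearchType.EVALUATION_RESEARCH),
    ("P03", ResearchType.SOLUTION_PROPOSAL),
]

# Frequency of each class, e.g., as input for a chart of the classified studies
counts = Counter(rt.value for _, rt in extraction_sheet)
for class_name, count in counts.items():
    print(f"{class_name}: {count}")
```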

List of Journals and Conferences


List of Journals


ACM Computing Surveys
ACM Transactions on Software Engineering and Methodology (TOSEM)
Annals of Software Engineering
Automated Software Engineering
ELSEVIER Information and Software Technology (IST)
ELSEVIER Journal of Systems and Software (JSS)
IEEE Software
IEEE Transactions on Software Engineering
Software Process: Improvement and Practice
Software Practice and Experience (SPE)
Journal of Software Maintenance Research and Practice
Software Quality Journal
Software Testing, Verification and Reliability

List of Conferences

Acronym Conference Name
APSEC Asia Pacific Software Engineering Conference
ASE International Conference on Automated Software Engineering
CAiSE International Conference on Advanced Information Systems Engineering
CBSE International Symposium on Component-based Software Engineering
COMPSAC International Computer Software and Applications Conference
ECBS International Conference and Workshop on the Engineering of Computer Based Systems
ECOWS European Conference on Web Services
ECSA European Conference on Software Architecture
ESEC European Software Engineering Conference
ESEM Empirical Software Engineering and Measurement
FASE Fundamental Approaches to Software Engineering
ICCBSS International Conference on Composition-Based Software Systems
ICSE International Conference on Software Engineering
ICSM International Conference on Software Maintenance
ICSR International Conference on Software Reuse
ICST International Conference on Software Testing, Verification and Validation
ICWS International Conference on Web Services
ISSRE International Symposium on Software Reliability Engineering
GPCE International Conference on Generative Programming and Component Engineering
MODELS International Conference on Model Driven Engineering Languages and Systems
MoTiP Workshop on Model-based Testing in Practice
OOPSLA ACM SIGPLAN conference on Object-Oriented Programming, Systems, Languages, and Applications
PROFES International Conference on Product Focused Software Development and Process Improvement
QoSA International Conference on the Quality of Software Architectures
QSIC International Conference on Quality Software
ROSATEA International Workshop on The Role of Software Architecture in Testing and Analysis
SAC Annual ACM Symposium on Applied Computing
SEAA Euromicro Conference on Software Engineering and Advanced Applications
SEKE International Conference on Software Engineering and Knowledge Engineering
SPLC Software Product Line Conference
SPLiT Software Product Line Testing Workshop
TAIC PART Testing: Academic and Industrial Conference
TEST International Workshop on Testing Emerging Software Technology
WICSA Working IEEE/IFIP Conference on Software Architecture
WS-Testing International Workshop on Web Services Testing