Technical debt is a metaphor that software developers and managers increasingly use to communicate key tradeoffs related to release and quality issues. The Managing Technical Debt workshop series has, since 2010, brought together practitioners and researchers to discuss and define issues related to technical debt and how they can be studied. Workshop participants reiterate the usefulness of the concept each year, share emerging practices used in software development organizations, and emphasize the need for more research and better means for sharing emerging practices and results.
The Ninth Workshop on Managing Technical Debt will bring together leading software researchers and practitioners, especially from the area of iterative and agile software development, to explore theoretical and practical techniques for quantifying technical debt.
The following topics are aligned with our theme:
techniques and tools for managing technical debt in agile, DevOps, and other software development environments
techniques and tools for calculating technical debt principal and interest
technical debt in code, design, architecture, and development and delivery infrastructure
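To make the principal-and-interest framing concrete, here is a minimal sketch of how one might quantify the tradeoff. This is an illustrative model, not a standard formula: it assumes "principal" is the one-time cost to remediate the debt now and "interest" is the recurring extra effort paid each release until the debt is repaid. All names and numbers are hypothetical.

```python
# Illustrative technical-debt model (not a standard formula):
# principal = one-time remediation cost, interest = recurring extra
# maintenance effort paid each release while the debt remains.

def breakeven_releases(principal_hours: float,
                       interest_hours_per_release: float) -> float:
    """Number of releases after which accumulated interest payments
    exceed the one-time cost of paying down the principal."""
    return principal_hours / interest_hours_per_release

# Example: refactoring costs 40 hours now; the workaround costs an
# extra 5 hours of maintenance every release it stays in place.
releases = breakeven_releases(40, 5)
print(releases)  # 8.0 -- beyond 8 releases, carrying the debt costs more
```

A model this simple ignores discounting and uncertainty, but even a back-of-the-envelope breakeven point can make the release-versus-quality tradeoff discussable with managers.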
Writing a great session proposal for a developers' conference can be difficult, even for experienced public speakers and authors. Proposal writing is a distinct skill, different from writing great papers and giving amazing presentations. Your session proposal is what the reviewers will use to decide whether your session should be in the SATURN Conference technical program. Here are seven tips for writing a session proposal that will make reviewers go from frown to smile.
Include all of the information requested in the Call for Submissions. Whether you are proposing an experience report presentation, participatory session, DEV@SATURN talk, or tutorial, the program committee needs this information to give your proposal a fair and thorough review. There are two fields for conveying this information: the "Description" field and the "Notes" field. The description is the public abstract of your talk. It will be seen by reviewers of your submission and eventually by attendees of the event. Make the description of your talk as compelling and exciting as possible. Remember, you're selling both to the event organizers, who select your talk, and to conference attendees, who decide whether to come see it. The Notes section will be seen only by reviewers. This is where you should explain things such as technical requirements and why you're the best person to speak on this subject.
Be as specific as possible. The SATURN Program Committee is an experienced group with a wide range of backgrounds. Its members are excited to hear specifics about what you want to teach us at SATURN and how you will do it.
Include an outline, even if it's only tentative. The "Notes" section is your chance to expand on your abstract. Including an outline shows that you've thought through what you plan to present and helps the reviewers better understand the concepts and lessons you want to share at SATURN. For participatory sessions and tutorials, how will you spend your time?
Tell us who you are. In a few sentences, tell us about your work history, your education, and any past work or research that will help establish your credibility. Reviewers also want to know that you can speak from first-hand experiences. The best way to demonstrate this is with a brief biography, highlighting your experience as a speaker or instructor. If you have access to a video of yourself presenting at a conference, include the link in "Additional Information" so the program committee can see you in action.
Use your abstract to summarize and sell your talk. Why should participants attend? What will they learn? What is your topic and why is it important? A great abstract will describe the topic, summarize the key lessons, and pique interest.
Show how your session will have lasting appeal. Reviewers want to see how your experience report, participatory session, DEV@SATURN talk, or tutorial will improve our overall understanding of software architecture. This could mean many different things, from sharing a completely new idea or experience to explaining a classic concept in a way that no one ever has before. Take a look at the SATURN 2016 and 2017 presentations and videos for some great examples.
Plan ahead and submit early. Submissions close on January 16, 2017. Do not wait until the last minute to prepare your submission. Prepare, take time to review, refine your proposal, and submit early.
Historically, only 20-30% of submissions have been accepted. The conference chairs want to make the program committee's job as difficult as possible by making sure there is an abundance of amazing proposals from which to choose. They don't want any rejection to be easy. They want reviewers fighting to include your presentation, participatory session, or tutorial in the program. Arm them with the information they need to give your topic a thorough review by including all the information requested in the Call, by being as specific as possible, and by planning ahead to allow plenty of time to refine your submission.
Can you help us test improvements to the SEI external web presence?
What is happening with the SEI's external web presence?
The SEI is improving the user experience of our external web presence. We have made improvements to the website information architecture based on previous tree testing and design research and need to validate the success of those improvements.
What is tree testing and how does it work?
Tree testing is an evaluative design research method for assessing how content on a website is organized and how easily it can be found. It is deployed using information architecture validation software called Treejack, which allows us to determine whether our changes to the information architecture have improved content findability.
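To illustrate what "assessing discoverability" means in practice, here is a small sketch of the kind of per-task metric a tree-testing tool reports. The data, task, and node labels are hypothetical; a tool like Treejack computes similar success rates from where participants end up in the content tree.

```python
# Illustrative scoring of tree-test results (hypothetical data):
# a task's success rate is the share of participants whose final
# chosen node in the tree is one of the correct destinations.

def success_rate(chosen_nodes, correct_nodes):
    """Fraction of participants who ended on a correct node."""
    hits = sum(1 for node in chosen_nodes if node in correct_nodes)
    return hits / len(chosen_nodes)

# Hypothetical task: "Find the SEI's reports on technical debt."
chosen = ["Publications/Reports", "About/History", "Publications/Reports",
          "Research/Agile", "Publications/Reports"]
rate = success_rate(chosen, {"Publications/Reports"})
print(f"{rate:.0%}")  # 60%
```

Comparing such rates before and after an information architecture change is one way to validate whether the reorganization actually helped.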
Since 2010, the SEI and IEEE have been conferring two attendee-selected awards at SATURN. The IEEE Software SATURN Architecture in Practice Presentation Award is given to the presentation that best describes experiences, methods, and lessons learned from the implementation of software architecture practices. This year's award winner was Patrick Kua of ThoughtWorks for his presentation titled Evolutionary Architecture.
The second award, the IEEE Software SATURN New Directions Presentation Award, is given to the presentation that best describes innovative new approaches and thought leadership in the application of software architecture practices. This year's award winner was João de Sousa of Robert Bosch LLC for his presentation titled Going Bezirk: Things Plus Cloud Do Not Equal IoT.
In addition to reflecting the high regard of SATURN attendees, these awards also contribute to the maturation of the practice of software architecture by recognizing sound and innovative practices.
Kent will discuss how one tech-savvy parent who is raising two "digital-native" children is working to prepare the coming generation for the changes and career opportunities that the Internet of Things is bringing to our world. This session will feature the kids who take part in Kent's weekly workshops and their IoT projects, built with robots, drones, and Chromebooks. The future of IoT may just rest with these smaller humans, and SATURN attendees will have the opportunity to check out their work.
Here is a preview:
SATURN 2016 will take place May 2-5, 2016, in San Diego, California. Registration is open, and we hope you will choose to participate.
For two decades, the SEI has been instrumental in the creation and development of the field of software engineering known as software architecture. An architect whose skills and capabilities match a project's needs is more likely to be successful. So what are those skills?
Join SEI researchers and an industry colleague in a live-streamed discussion on What Makes a Good Software Architect?
Topics to be covered
John Klein and Andrew Kotov on Skills and Knowledge of Successful Architects
Ipek Ozkaya and Michael Keeling on Architects Design Trade-off Toolbox: Balancing Agility and Technical Debt
What attendees will learn
How the technical skills needed by a software architect change throughout a system's lifecycle and how this influences the architect's success
How architects should be the champions of product quality while making the right (and timely) design tradeoffs
Continuous software evolution and delivery refers to the organizational capability to innovate and release software in fast parallel cycles, typically hours, days, or a few weeks. This requires not only the adoption of more agile processes in development teams and large investments in automated release pipelines, but also changes in technology and practices across potentially the entire research and development organization. Furthermore, automatic live monitoring and experimentation with different system alternatives enable fast evaluation and decision making. Finally, best practices and success stories should be documented and exchanged.
The Workshop on Continuous Software Evolution and Delivery (CSED), which grew out of the successful RELENG and RCoSE workshops, aims to bring together the research communities of these areas to exchange challenges, ideas, and solutions that move continuous software evolution and delivery a step closer to being a holistic, continuous process. The workshop will be run in a highly interactive manner to stimulate discussion, featuring presentations of practitioner abstracts and academic papers as well as breakout groups focused on identifying challenges, existing approaches, and potential directions for solutions.
I see microservices as an architectural pattern or style. Some styles are well described in the literature (Roy Fielding's description of REST is an example). Unfortunately, there was no clear description of the microservices style when it became popular. And the growing buzz has contributed to the confusion because different people try to add nice-to-have characteristics to what a microservice should be. For example, based on things I've read, one might conclude that microservices should:
Have few lines of code (some say 10-100 LOC).
Not be part of circular calls across services.
Be developed by a team that can be fed by 1 or 2 pizzas.
Avoid synchronous communication (request-response calls).
Offer coarse-grained operations (though microservices are "fine-grained services").
Be run by the team who built them.
Not participate in complex service compositions.
Have the freedom to use new languages, frameworks, and platforms at will.
Manage their own databases.
Be monitored by sophisticated logging and monitoring infrastructure.
Be developed by teams split around business capabilities motivated by Conway's law.
Match DDD bounded contexts.
Be dynamically discovered.
Be small enough to be rewritten in two weeks.
Thus, the requirements for a microservice are all over the place, and many simply restate good software engineering practices that have been known for decades.
I'd rather stick with the well-known description by Lewis and Fowler: "the microservice architectural style is an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API. These services are built around business capabilities and independently deployable by fully automated deployment machinery."
From an architecture perspective, the microservice style belongs primarily to the deployment view. It dictates that the deployment unit should contain only one service or just a few cohesive services. The deployment constraint is the distinguishing factor. As a result, microservices are easier to deploy, become more scalable, and can be developed more independently by different teams using different technologies. Beyond that constraint, the other characteristics describe a lean SOA service that does not use orchestration or an ESB.
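The Lewis/Fowler description above can be sketched in a few lines: one small service, running in its own process, exposing a lightweight HTTP resource API. This minimal example uses only Python's standard library; the "orders" resource and its in-process data store are hypothetical placeholders for a real business capability and database.

```python
# Minimal sketch of a single microservice (illustrative, stdlib only):
# one small service per process, communicating over an HTTP resource API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

ORDERS = {"1": {"id": "1", "status": "shipped"}}  # stand-in for the service's own database

def lookup(path):
    """Resolve a resource path like /orders/1 to (status, json_body)."""
    parts = path.strip("/").split("/")
    if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
        return 200, json.dumps(ORDERS[parts[1]])
    return 404, ""

class OrderService(BaseHTTPRequestHandler):
    def do_GET(self):
        status, body = lookup(self.path)
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

# To run the service as its own deployment unit (one process = one unit):
#   HTTPServer(("localhost", 8080), OrderService).serve_forever()
```

Note what the sketch does and does not show: the deployment constraint (the process is the unit) is visible, while the nice-to-have characteristics from the list above (team size, discovery, logging infrastructure) are orthogonal to it.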
Despite the lack of clarity about what microservices really are, it's exciting to see the many ideas, tools, and discussions that have sprung from them. I'm not sure we're past the peak of inflated expectations with microservices, but I certainly look forward to seeing what consolidated shape microservices will take when they reach the plateau of productivity. At the moment, for us software developers, more important than having a clear definition is understanding the tradeoffs of using microservices.
According to DevSecOps: Early, Everywhere, at Scale, a survey published by Sonatype, "Mature DevOps organizations are able to perform automated security analysis on each phase (design, develop, test) more often than non-DevOps organizations." Because DevOps enables strong collaboration, automates the process, and enforces traceability, mature DevOps organizations are more likely to perform automated security analysis than non-DevOps organizations. My previous blog post, Microcosm: A Secure DevOps Pipeline as Code, addressed the problem that most organizations do not have a complete deployment pipeline in place (and are therefore not considered DevOps mature) by automating penetration tests of software applications and generating HTML reports as part of the build process through the Jenkins CI service. In this follow-up blog post, I explore a service evolution of Microcosm as a simple one-stop shop for anyone interested in learning how to implement a DevSecOps pipeline.
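To give a flavor of what "generating HTML reports as part of the build" can look like, here is a small sketch of one pipeline step. It is not Microcosm's actual code: the findings data, severity threshold, and function names are all hypothetical. The idea is simply that scanner output is rendered into an HTML report and that high-severity findings signal build failure.

```python
# Illustrative DevSecOps pipeline step (hypothetical data and threshold):
# render penetration-test findings as HTML and gate the build on severity.
from html import escape

def render_report(findings):
    """Turn a list of findings into a minimal HTML table."""
    rows = "".join(
        f"<tr><td>{escape(f['severity'])}</td><td>{escape(f['title'])}</td></tr>"
        for f in findings
    )
    return f"<table><tr><th>Severity</th><th>Finding</th></tr>{rows}</table>"

def build_should_fail(findings, threshold="high"):
    """Fail the build if any finding meets the severity threshold."""
    return any(f["severity"] == threshold for f in findings)

# Hypothetical scanner output for one build:
findings = [
    {"severity": "high", "title": "SQL injection in /search"},
    {"severity": "low", "title": "Missing X-Content-Type-Options header"},
]
print(render_report(findings))
print("FAIL" if build_should_fail(findings) else "PASS")  # FAIL
```

In a real pipeline, a CI service such as Jenkins would archive the HTML report and use the pass/fail signal to stop the deployment stage.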